seo – Four 301-redirects from “www.example.com” to “shop.example.com” (am I losing my Google PageRank?)

Well, it’s been 2 years, so “urgent” is relative, but yes, you want to fix this: each redirect is believed to come with a small loss of “google juice”.

Matt Cutts (an official Google spokesperson at the time) said it is unlikely that Google will follow 4 or more redirects – although that was a while ago (see https://youtu.be/r1lVPrYoBkA?t=165, from about 2:45).

There are multiple ways to set up a redirect in Apache. The simplest in your case is probably the following, in the Apache config or .htaccess:

  Redirect permanent / https://shop.example.com/

(Ref: https://httpd.apache.org/docs/current/mod/mod_alias.html#redirectpermanent)
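If you prefer mod_rewrite (for example, to match only the www host while leaving other subdomains alone), a single rule can collapse the whole chain into one hop. A sketch, assuming mod_rewrite is enabled and this lives in the www site’s config or .htaccess:

```apache
# Send every request for www.example.com straight to the shop host
# in a single 301, preserving the requested path (and, by default,
# the query string).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://shop.example.com/$1 [R=301,L]
```

Either way, the point is that the old host answers with exactly one 301 to the final HTTPS URL, instead of hopping through four intermediate redirects.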

seo – Four 301-redirects from ‘www.myproject.com’ to ‘shop.myproject.com’ (am I losing my Google PageRank?)

Some years ago, I started my website with the main domain ‘www.myproject.com’. However, the project ended up being a store under ‘shop.myproject.com’.

I have obtained hundreds of links to ‘http://www.myproject.com’, and today (after two years) I discovered that the redirection is badly constructed: there is a chain of four 301-redirects from ‘http://www.myproject.com’ to ‘https://shop.myproject.com’.

I have two questions:

  1. Should I urgently fix this “too many 301-redirects” issue so that I don’t lose my Google PageRank?
  2. Is there a quick Apache RewriteRule for a 301-redirect from ‘http://www.myproject.com’ to ‘https://shop.myproject.com’?

google – PageRank: will links pointing to pages protected by robots.txt still count?

There are three schools of thought on this.

a. Yes, PageRank will pass to the robots.txt-blocked page and be lost there, so find a way not to do it.

b. No. It’s an internal link, and the way PageRank flows around a site prevents it from being lost to pages that are blocked by robots.txt.

c. John Mueller’s position (he has actually commented on this thread) is that no, it won’t impact you – but then he muddies the waters by saying you’d be better off working on your content instead. Since tech and content teams can work in parallel, as he well knows, this reasoning is a straw man and not at all useful. It’s impossible to tell whether he means the impact is so small that you should focus elsewhere, or that internal linking to pages blocked by robots.txt has zero impact.

He has also gone on the public record saying that most of what you read on this topic is dated, wrong, etc. – so who knows whether what he said was true in the first place, or is still true now?

I don’t know which is right – so I:

a. Assume A is correct. I benefit if it’s right, and cause no harm if it’s wrong.
b. Assume B is incorrect. I benefit if it’s wrong, and cause no harm if it’s right.
c. Assume John is not going to give a straight answer that definitively closes the subject. He has stated his opinion about where time might best be spent, but he does not absolutely rule out a benefit from not linking to pages blocked by robots.txt.

So – I avoid linking to pages blocked by robots.txt wherever possible.

His answer is also not helpful because he talks about using robots.txt to stop duplicate content, when it is more often used to stop faceted content from being indexed – a subset of the main content, but not duplicate.

data mining – PageRank vector

I computed the PageRank vector for the example given in https://en.wikipedia.org/wiki/PageRank (where the picture shows node B ending up with a score of 38.4, node C with 34.3, and node D with 3.9). I implemented the PageRank algorithm, but my numbers are slightly different: 39.8 for node B, 36.1 for node C, 3.5 for node D, etc. I was wondering if anyone could run the simulation and obtain the same results they have. My question is: what algorithm was used to obtain their numbers?

My algorithm is as follows. Starting with the uniform distribution $r$, I did power iteration using the update $r \leftarrow Ar$, where $A = 0.85 M + 0.15 J$, $M$ is the transition matrix of the Web graph given in the example, and $J$ is the matrix whose every entry is $1/N$ ($N=11$ is the number of nodes).
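For what it’s worth, here is a minimal sketch of that power iteration. The four-node graph below is a hypothetical example (it is not the Wikipedia figure, whose exact edge set I won’t reproduce here); one common source of small discrepancies like the one described above is how dangling nodes (pages with no outlinks) are handled, since $M$ is only column-stochastic once they are dealt with:

```python
import numpy as np

def pagerank(links, n, d=0.85, tol=1e-10):
    """Power iteration r <- (d*M + (1-d)/n * J) r.

    The J term reduces to a constant vector because sum(r) == 1,
    and the graph below has no dangling nodes, so M is column-stochastic.
    """
    M = np.zeros((n, n))
    outdeg = np.zeros(n)
    for i, j in links:
        outdeg[i] += 1
    for i, j in links:
        M[j, i] = 1.0 / outdeg[i]            # column i spreads i's rank over its outlinks
    r = np.full(n, 1.0 / n)                  # start from the uniform distribution
    teleport = np.full(n, (1.0 - d) / n)     # the (1-d)/n * J @ r term, since sum(r) == 1
    while True:
        r_new = d * (M @ r) + teleport
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Hypothetical 4-node graph (0=A, 1=B, 2=C, 3=D): A->B, A->C, B->C, C->A, D->C
edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2)]
r = pagerank(edges, 4)
```

With these edges, node C (index 2) collects the most rank, and node D, which has no inlinks, keeps only the teleport mass $(1-d)/N = 0.0375$. The iteration converges geometrically because the subdominant eigenvalue of $A$ is at most $d$.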

I will create 500 Do-follow High PR4-PR7 backlinks for $5

I will create 500 Do-follow High PR4-PR7 backlinks

I will submit 500 Do-Follow backlinks

“Do-follow” is an internet term applied to pages on the web that use the do-follow attribute, as opposed to no-follow. No-follow means that a search engine such as Google will not pass on the benefits of a certain type of hyperlink.

  BENEFITS OF MY SERVICE

  

• Very fast work

• 100% safe

• 100% hard work on every order

• Delivery in only 2 days

• High-quality work


pagerank – Does a rel=”nofollow” link help in SEO?

I use both dofollow and nofollow links in my SEO campaigns. Both have value, but they should be used differently. Let’s say I have a Facebook Fan Page with a keyword/phrase I want to rank for. Facebook will not directly push my website up the SERPs when it comes to Google juice. However, in a roundabout way, it does! Facebook has a PageRank of 10 and is nofollow, so no juice passes. But if promoted correctly, that Facebook Fan Page will drive traffic to your website because it will rank high in the SERPs.

It is not that hard to have a Facebook Fan Page rank well in the SERPs quickly. This means that you will have a presence on that page targeting your keywords. That presence will be reflected in more traffic. That traffic will help push your website/blog high in the SERPs.

On the other hand, you could create and develop a blog on WordPress.com, which is dofollow and has a PageRank of 9. If you build unique and interesting content on that blog with a link to your website, you will get Google juice from that link. As with the Facebook Fan Page mentioned above, it is possible for your WordPress blog to grab one of the positions on the first page of the SERPs.

In essence, you may have 3 different properties pop up on the 1st page!

Many people do not understand how powerful blogs, Facebook, Twitter and Tumblr can be. If done correctly you can dominate the SERPs for your keyword. Note that some keywords are so competitive that it could take a long time.

In sum, both dofollow and nofollow have their own uses. Lastly, you can have a dofollow property pushing juice to either a nofollow or another dofollow property. Once you place a nofollow into the mix, the juice stops flowing from one property to the next.

Will Create 500 High Quality PR4-PR7 Tracking Links For $5

will create 500 high quality tracking links PR4-PR7

I will create 500 high-quality tracking links through directory submission. This will help grow your website and increase your traffic, which will generate more income. Tracking backlinks are the only way to increase your Google PageRank, and improving your Google rank is very important. I provide high-quality SEO services with contextual tracking backlinks. Increase the number of positive actions on your website when it first appears on Google!

What you will get:

  • High-quality tracking contextual links
  • SEO links
  • All links are 100% safe with respect to the Penguin and Panda updates


sitemap – Is there an SEO or PageRank benefit in grouping (rel = "alternate") multilingual websites (using multiple domains) together?

I am building this website that will be multi-region / multi-language.

And for that, Google suggests the following:

https://support.google.com/webmasters/answer/182192?hl=en


As my domain will have to change according to the language of the country, I will have to go with Option 1.

A country-specific domain.

Example:

  • www.name-in-english.net
  • www.name-in-deutsch.net
  • www.name-in-spanish.net
  • www.name-in-french.net

Since they all have different domains, if I don't let Google know that they are the same website, they will all be treated as unique, separate websites.

Google suggests that you can "group" them using one of those 3 options:

https://support.google.com/webmasters/answer/189077


If I'm doing it, I'll use the Sitemap option.
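For reference, a sitemap using the xhtml:link alternate annotations looks roughly like this (the domains are taken from the list above; the hreflang codes are illustrative, and each <url> entry must repeat the full set of alternates, including itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.name-in-english.net/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.name-in-english.net/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.name-in-deutsch.net/"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://www.name-in-spanish.net/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://www.name-in-french.net/"/>
  </url>
  <!-- Repeat a <url> block like the one above for every page, once per
       domain, each with its own <loc> and the same set of alternates. -->
</urlset>
```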

But my question is:

What do I gain by letting Google know that all of these websites are the same, but in different languages? What will be different if I let Google think that they are all unique and separate websites?

Do I gain some SEO or PageRank benefit by doing that?

I mean, what if I get a lot of traffic to my website in English? Does that benefit my results in the other languages on the other domains?

Rank your website on the first page of Google, guaranteed, for $110

Rank your website on the first page of Google, guaranteed

Do you want to SKYROCKET YOUR RANKINGS and be FIRST on Google? Look no further, you are in the right place! Guaranteed results or a full refund, so why not try? With my skills and knowledge I can rank you on the first page of Google. I am committed to providing exceptional results and incredible service. My SEO strategy is kept up to date to improve your website's ranking and shoot it to the first page of Google. I have already ranked thousands of websites using my high-authority backlinks! Order now and SKYROCKET YOUR RANKINGS!


Pagerank: Is a low domain authority bad for backlinks?

I know that a "domain authority" rating is a bit subjective and different engines may have different parameters.

But let's say I have a newly created website for my aunt. It is new and has NO authority.

On a tool like moz.com, it has the lowest possible authority score of 0.

Two questions:

  1. Is it bad to put a backlink on my aunt's website pointing to my website ("Website designed by my dear nephew – example.com")? Or could it simply be neutral: not good, but not bad?

  2. Then comes the second question: imagine that backlinks from spammy websites may be attacking my site. Is there any way to find harmful websites, with some kind of negative domain authority?

Related question: Is page authority or domain authority more important for backlinks?

How to identify spam domains that provide backlinks to my site (in order to submit disavow links in WMT)? (But the links in that answer are no longer active.)