htaccess – How to redirect all pages of a subdirectory to another one?

I'm not sure how this would solve "duplicate content problems", but anyway …

Redirect 302 /articles/ /article/

The mod_alias Redirect directive is prefix-matched, so the above matches /articles/&lt;anything&gt; and redirects to /article/&lt;anything&gt;, passing along the rest of the URL-path.

This is an explicit 302 (temporary) redirect.

However, if you have other directives (particularly mod_rewrite) in your .htaccess then you may need to change this. The order is also important.
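If mod_rewrite is already in use, a sketch of the equivalent rule using mod_rewrite instead (assuming the .htaccess file sits in the document root) would be:

```apache
# mod_rewrite equivalent of the mod_alias Redirect above.
# Prefix-matches /articles/... and 302-redirects to /article/...,
# preserving the rest of the URL-path.
RewriteEngine On
RewriteRule ^articles/(.*)$ /article/$1 [R=302,L]
```

Keeping everything in mod_rewrite avoids the two modules acting independently on the same request.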

Reference:

.htaccess redirect by nameserver – Webmasters Stack Exchange


Using htaccess to redirect 301 for domain change

I am currently moving from one domain to another; this is not an HTTPS migration, but a move from exampleone.co.uk to exampletwo.co.uk. I am using this code:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^exampleone\.co\.uk$ [OR]
RewriteCond %{HTTP_HOST} ^www\.exampleone\.co\.uk$
RewriteRule (.*)$ http://www.exampletwo.es/$1 [R=301,L]

This works well for existing page names, since the site structure is not changing. However, it also redirects non-existent pages to the index page, basically creating undesirable soft 404s.

Does anyone have any idea how to 301-redirect the valid pages from exampleone.co.uk, but return a 404 when the page does not exist, while also honoring valid 301s within the new domain?

So we want to:

  1. 301-redirect the old site to the new one
  2. Honor valid 301s on the new site (for future changes)
  3. Return a 404, instead of redirecting to the index page, when no valid 301 target exists

Any guidance would be useful.

Thank you
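One possible approach, assuming the old document root still contains the files of the (unchanged) site structure, is to redirect only when the requested path actually exists, and let Apache serve its normal 404 otherwise. A sketch:

```apache
RewriteEngine On
# Only redirect requests for files/directories that actually exist
# on the old site; everything else falls through to a normal 404.
RewriteCond %{HTTP_HOST} ^(www\.)?exampleone\.co\.uk$ [NC]
RewriteCond %{REQUEST_FILENAME} -f [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule (.*) http://www.exampletwo.es/$1 [R=301,L]
```

If the old document root no longer holds the files, the alternative is an explicit one-rule-per-URL list of the valid redirects.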

htaccess: trying to create 1-to-1 redirects, exclude certain directories/files, and then redirect any other traffic to a new website

You can't mix RewriteRule and RedirectMatch like that. You need to implement all the redirects using mod_rewrite so that they work as intended; a RewriteRule cannot prevent a RedirectMatch from being applied, because the two modules run independently. Instead, your rules should be:

RewriteEngine On

# Exclude files/directories from the redirects
RewriteRule ^(documents)($|/) - [L]

# Redirect specific files/pages to new locations
RewriteRule ^page1\.html$ https://example.com/new-page1.html [R=301,L]
RewriteRule ^dir1/page1\.html$ https://example.com/random-new-page.html [R=301,L]
RewriteRule ^documents/awesome\.pdf$ https://example.com/new-webpage-to-replacepdf.html [R=301,L]

# Redirect anything else to the home page
RewriteRule .* https://example.com/homepage [R=301,L]

I would not suggest using that last rule, though. Redirecting everything to the home page is neither a good user experience nor good for SEO. Best practice is a rule that redirects each URL to its exact counterpart on the new site:

RewriteRule (.*) https://example.com/$1 [R=301,L]

.htaccess – Why/how could I remove index.php from the URL?

I added a RewriteRule to force https in a domain at the end of my .htaccess file

RewriteCond %{HTTP_HOST} ^www\.mydomain\.be [NC]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI}

It's working fine, except that afterwards I get /index.php/ inserted into every URL.

If I add RewriteEngine on before it, like this:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.mysite\.be [NC]
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI}

then my /index.php/ problem is solved…

I'm happy with that, but I do not understand why.

Note: these lines are added at the end of a stock Drupal .htaccess. I made no other changes, except that I uncommented the standard redirect-to-"www" rules (before my https rule at the end):

RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ http%{ENV:protossl}://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
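For context, the protossl variable used above is set earlier in Drupal's stock .htaccess, roughly like this (a sketch from memory; check your actual file):

```apache
# protossl is "s" when the request already uses HTTPS, empty otherwise,
# so the redirect target above becomes http:// or https:// accordingly.
RewriteRule ^ - [E=protossl]
RewriteCond %{HTTPS} on
RewriteRule ^ - [E=protossl:s]
```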

There is no special apache2 configuration either.

htaccess: how to ignore / redirect all URLs that match a given string

(I think there should be a WordPress-specific way to solve this, but anyway …)

Appropriate rel="canonical" tags in the head section should have resolved any duplicate content problems.

If these URLs no longer exist, you could argue they should return a 404. You can serve this using mod_rewrite at the top of your .htaccess file, before any existing mod_rewrite directives:

RewriteRule ^events/action~ - [R=404]

If you want to serve a "410 Gone" instead, change the R=404 flag to G.

To redirect to /events/ instead, change the directive to read:

RewriteRule ^events/action~ /events/ [R=302,L]

302 is a temporary redirect; 301 is permanent. (But only change it to 301 once you have tested it, to avoid caching problems.)

However, if you have blocking directives in robots.txt, then no bot that obeys robots.txt will ever see these responses. For example, if Google has previously indexed these URLs, you should consider removing the block in robots.txt so that Google can see the 4xx response.
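For illustration, a robots.txt block like the following (with a hypothetical path matching the rule above) is the kind of directive that would hide the 4xx responses from compliant crawlers:

```
User-agent: *
Disallow: /events/action~
```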

htaccess – Error showing TwitterCard

The Twitter Card stopped showing after installing an SSL certificate. Any suggestions for a fix? I added RewriteCond %{HTTP_USER_AGENT} !Twitterbot to .htaccess, but without success.

Here is the .htaccess:

Options +FollowSymLinks
RewriteEngine on
# twitter start
#RewriteBase /
RewriteCond %{HTTP_USER_AGENT} !Twitterbot
# twitter end
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?/$1 [QSA,L]

mod rewrite – htaccess redirect all traffic to the secure, non-www version of the page

I'm having problems with my .htaccess redirects.

I need to forward all URLs to the non-www version of the URL using HTTPS, and also forward any insecure URL to HTTPS. In both cases I need to keep the full URL.

I currently have these rules, which almost work the way I want, but they seem to redirect www traffic to the home page instead of keeping the rest of the URL. E.g.:

http://www.example.com/mycat/mypage.php

I would go to

https://example.com/mycat/mypage.php

My current code:

RewriteCond %{HTTP:X-Forwarded-Proto} !=https
RewriteCond %{HTTPS} !=on
RewriteCond %{REQUEST_METHOD} !=POST
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

I tried the rules at this link, but they seem to get stuck in a redirect loop:

htaccess redirect non-www to www with SSL/HTTPS

RewriteCond %{HTTP_HOST} !^www\.
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

With HSTS (double redirect):

RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

apache – Could someone explain what these .htaccess rules and "well-known" conditions are for?

Why are there two sets of conditions and rules? Can they be combined in a set?

These are two rules that serve two different purposes; they cannot be combined. Removing the (unnecessary) "well-known" conditions, we have:

RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.example.com/$1 [R,L]

RewriteCond %{REQUEST_METHOD} =POST
RewriteCond %{HTTP_USER_AGENT} ^.*(opera|mozilla|firefox|msie|safari).*$ [NC]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.+/trackback/?\ HTTP/ [NC]
RewriteRule .? - [F,NS,L]

The first is an HTTP-to-HTTPS redirect. (This really should be a permanent 301 redirect, not the temporary 302 it currently is.)

The second rule appears to block trackbacks. Trackbacks are a "feature" that allows websites/blogs (notably WordPress) to be notified of links to their articles, often displayed as a comment. However, spammers can abuse this, hence the desire to block them.

I'm confused about the purpose of the well-known exceptions and all those conditions. Is this just something temporary put in place by the host, or is it best practice to keep it?

These conditions have nothing to do with the rules themselves; they simply create exceptions so that the rules are not applied to those requests. This happens when your host automatically renews security certificates.

They are only needed temporarily (if at all). However, they are likely to be re-added in a few (3?) months, when the certificates are renewed.

However, as @Stephen points out in the comments, for these particular rules the conditions are probably superfluous anyway. cPanel does not discriminate; it blindly adds these conditions before every RewriteRule directive. If you delete them manually, cPanel will likely add them again at certificate-renewal time (as mentioned).
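For reference, the host-generated exclusions being discussed typically look something like this (a sketch; the exact pattern varies by host):

```apache
# Skip the rule for ACME (certificate-renewal) challenge requests,
# so the CA can fetch the validation file over plain HTTP.
RewriteCond %{REQUEST_URI} !^/\.well-known/acme-challenge/
```

A condition of this form sits immediately before a RewriteRule and exempts /.well-known/ validation requests from that one rule.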

See also my answers in the following ServerFault questions for more information on these conditions:
