Google change of address tool from .co.uk to .com resulted in huge derank

In December 2020 I moved a site from a .co.uk to a .com domain. No other changes, just a simple 301 redirect and use of the change of address tool in Google Webmaster Tools. The site had existed since 2006 and was the de facto site in the vertical, meaning I had plenty of P1s on Google for medium tail keywords.

Now, 2.5 months later, I am still hugely deranked for tens of thousands of keywords, and as a result I’ve lost 40% of my traffic.

I was wondering whether anyone had any advice. Obviously this is a huge issue for me, as it’s my livelihood, built up over 15 years of hard work. I followed Google’s directions perfectly (my background is actually web dev and SEO) and it has still deranked me.

I’m not sure of the linking policy here, but the site is tyrereviews dot com; it used to be tyrereviews dot co dot uk. For example, we used to rank 1 or 2 for the search “michelin primacy 4” on Google UK, but we are now on page 2.

The new .com domain is properly geo-targeted to the UK in Webmaster Tools too.

This is the same for many, many medium tail keywords.

mysql – Does a huge key length value for a multibyte column affect the index performance?

When I look at the EXPLAIN results, the key_len value is always calculated from the declared column length multiplied by the maximum number of bytes per character for the chosen encoding. Say, for a varchar(64) using utf8 encoding, the key_len is 192.

Does this number affect performance in any way, and should I reduce it when possible? I mean, does it make MySQL reserve some space somewhere that remains unused, or is it just a maximum possible value while the space actually used is based on the exact data length?

So the actual question is: if I have a column that contains only Latin letters and numbers, should I change its encoding from utf8 to latin1 with regard to the space occupied by the index and overall index performance?
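The 192 in the question is just worst-case arithmetic: the declared character length times the maximum bytes per character for the charset. A minimal sketch (the helper name is mine, not a MySQL API; the per-charset byte counts are from the MySQL docs, and real EXPLAIN output can add a couple of bookkeeping bytes for a VARCHAR length prefix or a NULL flag):

```python
# Worst-case bytes per character for some common MySQL charsets.
CHARSET_MAX_BYTES = {"latin1": 1, "utf8": 3, "utf8mb4": 4}

def worst_case_key_bytes(declared_chars, charset):
    """Worst-case bytes an index key can need for the column's characters.

    key_len in EXPLAIN is derived from this worst case, not from the
    bytes the stored values actually use.
    """
    return declared_chars * CHARSET_MAX_BYTES[charset]

print(worst_case_key_bytes(64, "utf8"))    # 192, matching the EXPLAIN output
print(worst_case_key_bytes(64, "latin1"))  # 64 for the same column in latin1
```

So switching a Latin-only column to latin1 shrinks the worst-case key size by a factor of three, which mainly matters for in-memory key buffers and index size limits rather than for how much each individual value occupies on disk.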

investing – Need Help Buying huge amount of Bitcoins

I want to invest in Bitcoin and have a program to buy 200-350 Bitcoins per week.
Could you please guide me on how to buy it from a valid and reliable person?
Could you introduce me to someone who has more than 80 Bitcoins in their wallet to sell? I can pay your brokering commission.
Please email me if you can help. Thanks.

Huge problems with Host Slayer LLC

Huge Problems with Host Slayer LLC!

1] I have been unable to access the Control Panel for almost 1 year.
2] I cannot remove m…

[FREE] Graphics Kit – HUGE Bundle of Conversion Exploding Graphics and Templates


Sales page:


Mediafire download:

[FREE] N-I-N-J-A Graphics Kit – Huge Bundle of Conversion Exploding Graphics and Templates



Sales Page:


Download Page:

FREE 1.4 FARM COIN. $750 already listed on major exchanges πŸš€ πŸš€ Huge predictions. Apply in telegram app. THIS ONE IS HUGE πŸͺ§ πŸ’¨ πŸš€


unity – How can I efficiently load huge volumes of star systems?

The trick Elite likely uses is that they don’t pre-generate the whole galaxy and store it in a database. They likely generate most of the galaxy at runtime when it is needed.

I would do this using a pseudorandom but deterministic algorithm which can generate the properties of every object in the galaxy at runtime just from its position.

So when a player zooms into a section of the galaxy, then the galaxy chunk generation algorithm is run, which takes the chunk coordinates as input and outputs a list of stars with position, color and size. Same input always results in the same output, so when another player zooms into the same chunk later, they get the same results. You might have different algorithms for different zoom levels which each take the output of the previous algorithm into account and add more detail to it. So the algorithm on the lowest zoom factor only generates the largest stars (so you can quickly generate a view which shows the whole galaxy at once), and the closer the player zooms into any part of the galaxy, the more additional small stars get generated in that area.
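The per-chunk idea above can be sketched in a few lines. Unity would use C#, but the principle is language-agnostic; in this Python sketch the star properties are made up, and the RNG is seeded from a string built out of the chunk coordinates so results are stable across runs and machines:

```python
import random

def generate_chunk(chunk_x, chunk_y, zoom_level, stars_per_chunk=16):
    # Seed the RNG deterministically from the chunk coordinates and zoom
    # level: the same chunk always yields the same stars, with nothing
    # stored on disk.
    rng = random.Random(f"chunk:{chunk_x}:{chunk_y}:{zoom_level}")
    stars = []
    for _ in range(stars_per_chunk):
        stars.append({
            "x": chunk_x + rng.random(),   # position inside the chunk
            "y": chunk_y + rng.random(),
            "size": rng.uniform(0.5, 5.0),
            "color": rng.choice(["O", "B", "A", "F", "G", "K", "M"]),
        })
    return stars

# Same input, same output: re-generating a chunk is free and reproducible.
assert generate_chunk(12, -7, zoom_level=3) == generate_chunk(12, -7, zoom_level=3)
```

The star system generator would look the same, just seeded from the star's color, size and position instead of chunk coordinates.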

Then, when the player clicks on any of these stars to zoom into its star system, the star system generation algorithm is run. Its input are color, size and galactic position of the star. Its output is a list of planets with their types, sizes and orbital parameters. It too is a deterministic algorithm so it always generates the same planets for the same star.

And then you can do the same thing with planet surfaces, cities on the planet surfaces, houses in those cities and rooms in those houses. So you end up with a galaxy with a level of detail which would take an exorbitant amount of data to store all at once. But you don’t need to store it all, because any of that data can be re-calculated on demand.

A really neat tool for procedural generation algorithms like that is noise algorithms such as Simplex Noise or Worley Noise: you can sample them at arbitrary locations to get reproducible results. Another is standard pseudorandom number generators, which can be initialized with a seed value and then always generate the same sequence of numbers for the same seed.

All you really need to store in a database is data which cannot actually be re-generated on demand:

  • Parts of the galaxy which you want to design by hand
  • Changes to the galaxy which are the result of player actions

First you check whether any such datasets exist in your database for the requested data, and if they don’t, you generate the data using the algorithms.
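That check-the-store-first logic can be sketched like this (the dict-backed store, the key shape, and the generator callable are placeholders for whatever database and algorithm the real game uses):

```python
def load_chunk(store, chunk_key, generate):
    """Return stored data for a chunk if any exists, else generate it.

    `store` maps chunk keys to hand-designed or player-modified data;
    `generate` is the deterministic chunk generator.
    """
    override = store.get(chunk_key)
    if override is not None:
        return override          # stored because it cannot be re-generated
    return generate(chunk_key)   # everything else is re-computed on demand

# Only the one edited chunk lives in the database; all others are generated.
db = {(0, 0): ["hand-placed home system"]}
assert load_chunk(db, (0, 0), lambda k: []) == ["hand-placed home system"]
```

A refinement would be to store player changes as diffs against the generated output rather than whole chunks, so a single destroyed station doesn't force you to persist the entire system.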


Now you just need to come up with algorithms which generate interesting and varied results and then with game mechanics which provide an engaging and interesting game experience which benefits from all that content variety. I am looking forward to playing what you will come up with.

web crawlers – Huge amount of bot requests making django session table explode

I deployed my website online on a .com domain, and since then I have periodically received huge numbers of requests with the same pattern. They all target my main page with GET parameters like:

?m=vod-play-id-4893-src-1-num-1.html

where 4893 is incremented on each request.



I have no video on my site, and these requests are completely unknown and unrelated to it.

Here is an example of log (heroku server):

2021-02-10T19:05:10.510705+00:00 heroku(router): at=info method=GET path="/en/?m=vod-play-id-4893-src-1-num-1.html" request_id=566b1a6d-5168-4c67-acdb-77a862d22217 fwd="2607:5300:60:3b5d::1," dyno=web.1 connect=0ms service=104ms status=200 bytes=20467 protocol=http

I have no experience with such things and no idea what they are looking for.

These requests don’t make the website slower, but they add thousands of rows to my Django session table every day, and I have to purge it very often.

Is there any way to prevent these requests from creating a Django session?
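One way to illustrate the idea: since Django only writes a session row when the session is actually touched, a predicate like the one below, called from a small custom middleware placed *before* SessionMiddleware in `MIDDLEWARE` so a match returns 404 before any view runs, would keep these requests from ever creating a session. The function name and regex are mine; the pattern is taken from the log line above and may need widening for other bot variants:

```python
import re

# Matches the "m" GET parameter seen in the logs:
#   m=vod-play-id-4893-src-1-num-1.html
VOD_BOT = re.compile(r"^vod-play-id-\d+-src-\d+-num-\d+\.html$")

def is_vod_bot(params):
    """Return True when the query parameters match the bot pattern."""
    return bool(VOD_BOT.match(params.get("m", "")))

print(is_vod_bot({"m": "vod-play-id-4893-src-1-num-1.html"}))  # True
print(is_vod_bot({"q": "michelin primacy 4"}))                 # False
```

In the middleware you would call `is_vod_bot(request.GET)` and return `HttpResponseNotFound()` on a match; because the response short-circuits before the view, `request.session` is never accessed and nothing is written to the session table.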

Huge Bonuses




There are 3 income streams:
1. Investment: 1.4% to 4.2% daily (Monday to Friday)
2. 15-level unilevel: get paid on all partners’ investments and re-investments (this is passive)
3. Structural turnover bonuses starting from $1,000 up to $3,000,000, this is crazy good.
You will love beurax!!!!


Check out for more information