I’m trying to design a Turing machine that, given a number in base 10, multiplies it by 2.
The problem seems trivial if the number is represented in binary, so my idea is to convert it from base 10 to base 2, multiply it by 2, and then convert it back.
But I’m not sure whether I’m on the wrong track here. Maybe there is a simpler way. Any hints or suggestions?
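For context, the operation I ultimately want the machine to perform is ordinary schoolbook doubling: scan the digits from right to left, write 2·d + carry at each cell, and propagate the carry. Here is a small Python sketch of that digit-level logic (not TM code, just the arithmetic a head could do in one right-to-left pass):

```python
def double_decimal(digits: str) -> str:
    """Double a base-10 number digit by digit, right to left,
    carrying exactly as a single tape pass would."""
    out = []
    carry = 0
    for d in reversed(digits):
        v = 2 * int(d) + carry
        out.append(str(v % 10))   # digit written back to the cell
        carry = v // 10           # carry moves one cell to the left
    if carry:
        out.append(str(carry))    # number grew by one digit
    return "".join(reversed(out))
```

Since each output digit depends only on the current digit and a one-bit carry, this suggests the doubling might be doable directly in base 10, without any base conversion.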