optimization: What is the correct term / theory for predicting binary variables from their continuous (relaxation) values?

I am working with a linear programming problem that has around 3,500 binary variables. Typically, IBM's CPLEX takes about 72 hours to reach a gap of around 15-20% to the best bound. In the solution, about 85-90 of the binaries take the value 1 and the rest are zero. The objective value is around 20 to 30 million. I have created an algorithm that predicts (fixes the values of) 35 binaries to 1 and lets CPLEX solve for the rest. This has reduced the time to reach the same objective to around 24 hours (the best bound is slightly compromised). I have tried this approach on other problems of the same type and it worked for them as well. I call this approach "probabilistic prediction," but I do not know what the standard mathematical term for it is.

The algorithm is shown below:

Let y = ContinuousObjective(AllBinariesSet);
WriteValuesOfTheContinuousSolution();
Let count = 0;
Let processedBinaries = EmptySet;
While (count < 35)
{
    Let maxBinary = AllBinariesSet.ExceptWith(processedBinaries).Max(); // binary with the largest fractional value between 0 & 1 (usually less than 0.6)
    processedBinaries.Add(maxBinary);
    maxBinary.LowerBound = 1; // fix maxBinary to 1
    Let z = y;
    y = ContinuousObjective(AllBinariesSet);
    if (z > y + 50000) // fixing degraded the relaxation objective too much
    {
        // Unfix maxBinary
        maxBinary.LowerBound = 0;
        maxBinary.UpperBound = 1;
        y = z;
    }
    else
    {
        WriteValuesOfTheContinuousSolution();
        count = count + 1;
    }
}
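For concreteness, here is a small self-contained Python sketch of the same fix-and-resolve loop. Since I cannot reproduce my CPLEX model here, the continuous relaxation is replaced by a fractional knapsack (which a greedy method solves exactly); the data, function names, and the `tolerance` threshold are all illustrative assumptions, not part of my real model.

```python
def lp_relaxation(values, weights, capacity, fixed_to_one):
    """Solve the knapsack LP relaxation with some variables fixed to 1.
    Returns (objective, fractional solution), or None if infeasible."""
    n = len(values)
    x = [0.0] * n
    cap = capacity
    for i in fixed_to_one:           # fixed variables consume capacity first
        x[i] = 1.0
        cap -= weights[i]
    if cap < 0:
        return None                  # fixing made the relaxation infeasible
    # fill the remaining capacity greedily by value/weight ratio
    order = sorted((i for i in range(n) if i not in fixed_to_one),
                   key=lambda i: values[i] / weights[i], reverse=True)
    for i in order:
        take = min(1.0, cap / weights[i]) if weights[i] > 0 else 1.0
        x[i] = take
        cap -= take * weights[i]
        if cap <= 1e-12:
            break
    obj = sum(v * xi for v, xi in zip(values, x))
    return obj, x

def fix_by_relaxation(values, weights, capacity, n_fix, tolerance):
    """Repeatedly fix the unfixed variable with the largest fractional
    value to 1; keep the fix only if the relaxation objective drops by
    less than `tolerance` (the 50,000 threshold in my pseudocode)."""
    fixed, tried = set(), set()
    y, x = lp_relaxation(values, weights, capacity, fixed)
    while len(fixed) < n_fix and len(tried) < len(values):
        candidates = [i for i in range(len(values))
                      if i not in fixed and i not in tried]
        if not candidates:
            break
        i = max(candidates, key=lambda j: x[j])  # most fractional-valued
        tried.add(i)
        result = lp_relaxation(values, weights, capacity, fixed | {i})
        if result is not None and y - result[0] < tolerance:
            fixed.add(i)             # accept: objective barely degraded
            y, x = result
        # otherwise the fix is rejected and the variable stays free
    return fixed, y

values = [60, 100, 120, 40]
weights = [10, 20, 30, 15]
fixed, obj = fix_by_relaxation(values, weights, 50, n_fix=2, tolerance=10)
# -> fixes items 0 and 1; relaxation objective stays ≈ 240
```

In my real model the "unfix and restore the old objective" branch plays the role of the `if` block above: a candidate whose fixing costs more than the threshold is bounded back to [0, 1] and never retried.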

My belief is that this works because the problem matrix is very sparse and there are many good (near-optimal) solutions.