How to update multiple DataFrame columns in Spark Scala

I have a DataFrame with columns (A, B, C, D, E). Some rows have incorrect values for C and D. I also have a map with the correct information (A -> (C, D)). How can I correct the values of columns C and D?
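For a concrete (made-up) example of my setup, the data and the correction map look roughly like this:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("fix-columns").getOrCreate()
import spark.implicits._

// (A, B, C, D, E); say the rows with A = 1 and A = 2 have wrong C/D values
val dataFrame = Seq(
  (1L, "b1", 10L, 20L, "e1"),
  (2L, "b2", 11L, 21L, "e2"),
  (3L, "b3", 12L, 22L, "e3")
).toDF("A", "B", "C", "D", "E")

// Correction map: A -> (correct C, correct D)
val newValuesMap: Map[Long, (Long, Long)] = Map(
  1L -> (100L, 200L),
  2L -> (110L, 210L)
)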

I know we can use the withColumn method to update a column's value, so I used it twice to update the two columns:

import org.apache.spark.sql.functions.{col, udf}

val fixCUdf = udf((A: Long, C: Long) =>
  if (newValuesMap.contains(A)) newValuesMap(A)._1 // corrected C value
  else C // keep the original value
)

val fixDUdf = udf((A: Long, D: Long) =>
  if (newValuesMap.contains(A)) newValuesMap(A)._2 // corrected D value
  else D // keep the original value
)

dataFrame.withColumn ("C", fixCUdf (col ("A"), col ("C")))

dataFrame.withColumn ("D", fixCUdf (col ("A"), col ("D")))

Is there a better way to do this, so that I do not have to define and call a separate fixXUdf for each column?
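What I am hoping for is something like a single UDF that returns both corrected values at once, for example as a struct, so the map is consulted only once per row. A rough, untested sketch of the idea (fixBothUdf is just a name I made up):

import org.apache.spark.sql.functions.{col, udf}

// One UDF returning both corrected values as a tuple; Spark exposes the
// result as a struct column with fields _1 and _2
val fixBothUdf = udf((a: Long, c: Long, d: Long) =>
  newValuesMap.getOrElse(a, (c, d))
)

val fixed = dataFrame
  .withColumn("fixed", fixBothUdf(col("A"), col("C"), col("D")))
  .withColumn("C", col("fixed._1"))
  .withColumn("D", col("fixed._2"))
  .drop("fixed")

Is something along these lines the recommended approach, or is there a cleaner way?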