I am perplexed by the following.
I have data on a Google spreadsheet that I am pulling into an array. I do some modifications on the data and then put it back into the spreadsheet. My numbers are formatted on the spreadsheet like this: 00.00%. When they come into the array they arrive as plain decimals like 0.2, and when I put them back on the spreadsheet in 00.00% format, they get divided by 100. So what started out as 20.00% comes back as 0.20%.
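If it helps, here is roughly what the round trip looks like. This is a simplified Apps Script sketch ('Sheet1' and A2:A10 are stand-ins for my real sheet and range):

```javascript
function roundTrip() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1');
  var range = sheet.getRange('A2:A10'); // cells formatted as 00.00%

  // getValues() gives me the underlying numbers, so 20.00% arrives as 0.2
  var values = range.getValues();

  // ...modifications to the data happen here...

  // After writing the array back, the sheet shows 0.20% instead of 20.00%
  range.setValues(values);
}
```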
So I thought I'd multiply all the numerical values by 100 before putting them back on the sheet (something like the snippet below), but that seems like it should be an unnecessary step.
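Continuing the sketch above, the workaround I have in mind would be something like this:

```javascript
// Scale every numeric value by 100 before writing it back,
// leaving any non-numeric cells untouched
var scaled = values.map(function (row) {
  return row.map(function (v) {
    return typeof v === 'number' ? v * 100 : v;
  });
});
range.setValues(scaled);
```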
My question is whether I could be doing something differently with how I'm handling the data in the array, or whether multiplying every value by 100 is my only option here.