I'm using a web service to pull some dates from a web server... which in turn pulls the information from a separate SQL server. The values I was getting were off by one hour, which made perfect sense since the server is located in the Eastern time zone and I'm in Central. So I implemented code within the web service to convert the date from the server's time zone to GMT, and at the destination, convert the GMT date back to the local time zone. It appeared to be working.
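My original project was .NET, but as a minimal sketch of the convert-to-GMT-and-back approach I had in mind, here it is in Python (the specific zone names and timestamp are hypothetical, just standing in for the Eastern server and Central client):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical value as the SQL server might report it, in Eastern time.
server_local = datetime(2024, 3, 1, 14, 30, tzinfo=ZoneInfo("America/New_York"))

# Step 1: convert to GMT/UTC on the web-service side...
as_utc = server_local.astimezone(ZoneInfo("UTC"))

# Step 2: ...and back to the client's local zone (Central) at the destination.
client_local = as_utc.astimezone(ZoneInfo("America/Chicago"))

print(client_local.isoformat())  # 2024-03-01T13:30:00-06:00
```

Same instant in all three values; only the clock face changes, which is why 2:30 PM Eastern shows up as 1:30 PM Central.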
I then implemented some code on the web server (the same one hosting the web service) to display similar information... in addition, showing the time zone of the machine as a point of reference for the data being displayed. It turned out the server's time zone was set incorrectly. I fixed this.
OK, so a couple of days go by and I'm working on the project again. Now the data being displayed is off by an hour in the application. ????? So I go through and verify that the time zone code is working properly. Then it hit me... I wonder if web services automatically take into account time zone differences for values passed as a date?
I removed the code previously added to the web service to convert the dates to GMT, and the code in the client to convert back to local time. Sure enough, the time zone information is automatically applied accordingly. Of course, this makes perfect sense... but because my server was incorrectly configured, it led me to believe that I needed to handle it myself.
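The reason it "just works" is that the serialized date carries the sender's UTC offset, so the receiver can recover the exact instant without any manual GMT round-trip. A small Python sketch of the idea (the wire value below is hypothetical, roughly what an ISO 8601 / xsd:dateTime timestamp with an Eastern offset looks like):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical serialized value sent by the web service, offset included.
wire_value = "2024-03-01T14:30:00-05:00"

# The receiver parses it into an offset-aware datetime...
received = datetime.fromisoformat(wire_value)

# ...and rendering it in the client's zone gives the correct local clock
# time automatically, with no hand-written conversion code.
local = received.astimezone(ZoneInfo("America/Chicago"))
print(local.isoformat())  # 2024-03-01T13:30:00-06:00
```

When I layered my own Eastern-to-GMT conversion on top of this built-in handling, the offset got applied twice, which is exactly the one-hour error I was chasing.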
Although I couldn't find any documentation that specifically outlines this behavior, I did come across this, which mentions DataSets behaving the same way.
Not that I'm really trying to make any sort of point here, just figured I'd mention it. Of course I'm laughing at myself for not looking up how date values are handled via web services in the first place ;-)