For example, suppose you forgot to add a unique constraint to a DB column and later realised you have some duplicate entries. How would you fix this? Download the DB, fix it, and re-upload; apply some hot fix; or something else?
It's called a data migration (look it up). Basically, you write a script that fixes the data, test it, and then run it against the production database.
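For the duplicate-entries example from the question, such a migration could look like this. This is a minimal sketch using Python's built-in sqlite3 module; the `users` table and `email` column are made up, and a real fix would run against your production engine, inside a transaction, after a backup:

```python
import sqlite3

# Hypothetical setup: a "users" table where "email" should have been
# unique, but duplicates slipped in.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO users (email) VALUES (?)",
                 [("a@example.com",), ("a@example.com",), ("b@example.com",)])

# The migration: delete every duplicate except the row with the lowest id...
conn.execute("""
    DELETE FROM users
    WHERE id NOT IN (SELECT MIN(id) FROM users GROUP BY email)
""")
# ...then enforce uniqueness so it cannot happen again.
conn.execute("CREATE UNIQUE INDEX idx_users_email ON users (email)")

print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 2
```

Deciding which duplicate to keep (lowest id here) is the part that needs the most thought, since the rows may differ in other columns.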
Some frameworks offer features to help you with data migrations. There are also tools to help you execute the code remotely (like Fabric in Python, or Capistrano in Ruby), but you could do it manually via SSH as well.
It's probably good practice to commit those data migrations (to have a history of them), but beware: they "rot" in no time and can be dangerous if executed twice. So, this is a personal opinion: once everything has gone OK, you should delete them.
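One way to defuse the "dangerous if executed twice" problem is to make migrations idempotent by recording which ones have already run. A minimal sketch, again with sqlite3; the `apply_once` helper and the `applied_migrations` bookkeeping table are hypothetical names, not from any framework:

```python
import sqlite3

def apply_once(conn, name, migration):
    """Run `migration(conn)` only if `name` has never been applied before."""
    conn.execute("CREATE TABLE IF NOT EXISTS applied_migrations (name TEXT PRIMARY KEY)")
    if conn.execute("SELECT 1 FROM applied_migrations WHERE name = ?", (name,)).fetchone():
        return False  # already ran; do nothing
    migration(conn)
    conn.execute("INSERT INTO applied_migrations (name) VALUES (?)", (name,))
    conn.commit()
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, credits INTEGER)")
conn.execute("INSERT INTO users (credits) VALUES (10)")

fix = lambda c: c.execute("UPDATE users SET credits = credits * 2")
apply_once(conn, "2024_double_credits", fix)  # runs the fix
apply_once(conn, "2024_double_credits", fix)  # skipped the second time

print(conn.execute("SELECT credits FROM users").fetchone()[0])  # 20, not 40
```

This is essentially what framework migration runners do under the hood, and it makes deleting old migration files less urgent, since re-running them is a no-op.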
I tend to "live on the edge" and apply a hot fix directly in the database, assuming it's a simple fix. This isn't ego; I'm confident in my ability to be careful in the database, and time is usually of the essence.
If it's a more complicated fix, I'll mimic the data in my dev environment, create my solution, test the hell out of it, then apply it to production.
We once had a DB table with thousands of records, and it was already in prod when we realised, after testing, that some of the entries were strange upper/lowercase combos, e.g. sTranGe... So we implemented a JS 'toupper' method for that field on the form instead of making changes to the DB.
Marco Alka
Software Engineer, Technical Consultant & Mentor
It depends on the number of changes that happen per unit of time. Sometimes you cannot take the database offline, because you would miss many queries, and you might run into bad problems if you try to replay them on the corrected database later on. So here are your options, imho: