Keeping the values in a development database in sync can be a challenge when multiple developers are changing the database structure.
For example, we recently needed to replace every email address in every database table with fresh values. That is not a problem when working with a single table, such as aspnet_Membership. It gets much more interesting when ten or more tables contain an email field, those tables change frequently, and you need to be sure you have covered them all. That is when you might need an alternative, but still simple, solution.
The basic idea is to find all table columns whose name contains the keyword ‘mail’ and whose type is string-based (char, nchar, nvarchar, and so on), and save that list to a temporary table.
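The complete script (attached to this post) contains the exact temp table definition; a minimal sketch of the shape the following snippets rely on could look like this:

```sql
-- Minimal sketch of the temp table the script fills;
-- see the attached script for the exact definition.
IF OBJECT_ID('tempdb..#column') IS NOT NULL
    DROP TABLE #column

CREATE TABLE #column
(
    columnCounter INT,
    objectId      INT,
    schemaName    SYSNAME,
    tableName     SYSNAME,
    columnName    SYSNAME
)
```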
INSERT INTO #column (columnCounter, objectId, schemaName, tableName, columnName)
SELECT ROW_NUMBER() OVER (ORDER BY C.name),
       O.object_id,
       S.name,
       O.name,
       C.name
FROM sys.columns C
    INNER JOIN sys.types T ON C.user_type_id = T.user_type_id
    INNER JOIN sys.objects O ON C.object_id = O.object_id
    INNER JOIN sys.schemas S ON O.schema_id = S.schema_id
WHERE T.name IN ('char', 'nchar', 'nvarchar', 'varchar', 'text', 'ntext')
    AND O.type = 'U'
    AND C.name LIKE '%mail%'
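Before touching any data, it can be worth eyeballing what was collected. This check is optional and not part of the original script:

```sql
-- Optional sanity check: list every column that will be updated
SELECT columnCounter, schemaName, tableName, columnName
FROM #column
ORDER BY columnCounter
```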
Our next step is to loop through all those columns and update their values with the new data.
DECLARE @counter INT = 1
DECLARE @columnsCount INT = (SELECT COUNT(*) FROM #column)
DECLARE @sqlCommand NVARCHAR(MAX)
DECLARE @newEmailValue NVARCHAR(100) = 'dev@example.com' -- example value, use your own
DECLARE @currentSchemaName NVARCHAR(100)
DECLARE @currentTableName NVARCHAR(100)
DECLARE @currentColumnName NVARCHAR(100)
DECLARE @rowsAffected INT

WHILE @counter <= @columnsCount
BEGIN
    SELECT @currentSchemaName = schemaName,
           @currentTableName = tableName,
           @currentColumnName = columnName
    FROM #column
    WHERE columnCounter = @counter

    -- Schema-qualify and quote the names so tables outside dbo
    -- (or with unusual names) are handled correctly
    SET @sqlCommand = 'UPDATE ' + QUOTENAME(@currentSchemaName) + '.' + QUOTENAME(@currentTableName)
                    + ' SET ' + QUOTENAME(@currentColumnName) + ' = ''' + @newEmailValue + ''''

    EXEC (@sqlCommand)
    -- Capture @@ROWCOUNT immediately: PRINT resets it to 0
    SELECT @rowsAffected = @@ROWCOUNT

    PRINT @sqlCommand
    PRINT CONVERT(VARCHAR(10), @rowsAffected) + ' row(s) affected in '
        + @currentTableName + '.' + @currentColumnName

    SET @counter = @counter + 1
END
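If you want a dry run first, one option (not part of the original script) is to wrap the loop in a transaction, inspect the printed row counts, and roll everything back:

```sql
-- Dry run: execute the updates, review the printed output, then undo
BEGIN TRANSACTION

-- ... run the WHILE loop above ...

ROLLBACK TRANSACTION  -- change to COMMIT TRANSACTION once the output looks right
```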
Because every email field will be overwritten with the value you specify in the script, it is strongly recommended to back up your database first. I’ve tested the script on a database with 300+ tables and the execution time was quite fast, about 3 seconds. Please find the complete script attached to this post.
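Taking that backup is a one-liner; the database name and file path below are placeholders to replace with your own:

```sql
-- Replace [DevDb] and the path with your database name and backup location
BACKUP DATABASE [DevDb]
TO DISK = N'C:\Backups\DevDb_before_email_scrub.bak'
WITH INIT, STATS = 10
```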