This is it! We've made it to the import step! This is when we finally move our data into Azure DevOps Services.
If you missed the earlier posts, start here.
I highly recommend Microsoft’s Azure DevOps Service Migration Guide.
Detach Collection
First, you need to detach the collection from TFS. To be clear, don't detach the database in SQL Server; detach the collection in Azure DevOps Server.
To detach the collection, open the Azure DevOps Server Administration Console, go to Collections, and choose Detach on the collection that is going to be imported.
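If you prefer the command line over the console, the detach can also be done with TfsConfig from the server's Tools folder. This is a sketch, assuming an Azure DevOps Server 2019 install path and a collection named DefaultCollection (both are assumptions; check the TfsConfig docs for your version):

# Run from an elevated prompt in the server's Tools folder,
# e.g. C:\Program Files\Azure DevOps Server 2019\Tools (path and collection name are assumptions)
TfsConfig collection /detach /collectionName:"DefaultCollection"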
Generate the Database Backup
If you have managed to keep your import under 30 GB, this step is fairly easy. If not, you are in for a harder import, because you will need to move your database to a SQL Server database in Azure instead. I won't cover that migration since I did not do this step, but here is the guide on how to do it.
So, if you are going the under-30 GB route, you need to create a DACPAC that will be imported into Azure DevOps Services. You should be able to run SqlPackage.exe from a Developer Command Prompt for Visual Studio, or from the following location:
C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150
Here is the packaging command:
SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;
Initial Catalog=[COLLECTION_NAME];Integrated Security=True"
/targetFile:C:\dacpac\Tfs_DefaultCollection.dacpac
/action:extract
/p:ExtractAllTableData=true
/p:IgnoreUserLoginMappings=true
/p:IgnorePermissions=true
/p:Storage=Memory
After the packaging is completed, you will have a new DACPAC at C:\dacpac\ with all your import data.
Upload the Package
We're not going to upload the package directly into Azure DevOps Services. First, we need to upload it to Azure Storage, and then we'll point Azure DevOps Services at the DACPAC there.
The easiest way to do this is to install the Azure Storage Explorer.
Open the Azure Storage Explorer app.
Choose Add Azure Account.
Login to your Azure Account.
Expand your storage account and go to Blob Containers.
Create a new blob container named dacpac (container names must be lowercase).
Upload the DACPAC file created by SqlPackage.exe.
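If you'd rather script the upload than click through the GUI, AzCopy can do the same job. This is a minimal sketch, assuming AzCopy v10 and a storage account named mystorageaccount (a placeholder), and that you've signed in with azcopy login:

# Sign in so AzCopy can write to the storage account
azcopy login
# Upload the DACPAC into the dacpac container (account name is a placeholder)
azcopy copy "C:\dacpac\Tfs_DefaultCollection.dacpac" "https://mystorageaccount.blob.core.windows.net/dacpac/Tfs_DefaultCollection.dacpac"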
Create the SAS Key
You need to create a shared access signature (SAS) that will allow Azure DevOps Services to access the DACPAC.
In Azure Storage Explorer, right-click the dacpac container and choose Get Shared Access Signature...
Set the expiration to one week from today.
Give it read/list rights, nothing else.
Copy the URL for the SAS Key.
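If you'd rather script this step, the Azure CLI can generate an equivalent SAS. A sketch, assuming your account is named mystorageaccount (a placeholder) and you are logged in with access to its keys; set the expiry date to one week out:

# Generate a read/list SAS token for the dacpac container (account name and expiry are placeholders)
az storage container generate-sas --account-name mystorageaccount --name dacpac --permissions rl --expiry 2020-01-01T00:00Z --output tsv
# The command outputs a SAS token; append it to the container URL
# (https://mystorageaccount.blob.core.windows.net/dacpac?<token>) to form the SAS URL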
Place this SAS URL in the import.json file from the Logs folder earlier, in the Source.Location field.
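For reference, here is roughly where that value goes. This is a heavily abbreviated sketch of import.json based on the file the Data Migration Tool generated for me (your version may differ slightly); leave every other field the tool produced intact. The ImportType is what you will later switch from a dry run to a production run:

{
  "Source": {
    "Location": "<PASTE-SAS-URL-HERE>"
  },
  "Properties": {
    "ImportType": "DryRun"
  }
}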
Import
That's it! We are ready to start the import!
Run the following command from the Data Migration Tool folder:
Migrate import /importFile:[IMPORT-JSON-LOCATION]
The import will begin and the command will provide a link to view the status of your import.
It does take a few minutes before you can even see the import page, so don't panic.
Once the import began, it took about two hours to complete... so this is a good time to take a break.
Validation
You did it! Your migration to Azure DevOps Services is complete. You should now verify that everything is working correctly.
Users
First, verify your list of users. You can find them under Organization Settings. I had to remove a lot of users who did not need access to the service, and then set the correct Access Level for the users who remained. We have a number of Visual Studio Enterprise subscriptions that I assigned to most of my developers, and our contractors received Basic access. Most importantly, make sure every user who should have access is listed.
This is a great chance to see how much Azure DevOps Services is actually going to cost you, so set this up just like your Production environment will be.
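If you have a large user list to audit, the Azure DevOps CLI (the azure-devops extension for the Azure CLI) can dump it for you. A sketch, assuming you have the extension installed; the organization URL is a placeholder:

# List the users in the new organization (org URL is a placeholder)
az devops user list --org https://dev.azure.com/myorg --output table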
Source Control
Because you moved your Git source control, you don't actually need to re-clone anything; you can just point your existing local repo at the new location.
You can change your local repo's origin with the following command (you can find the REMOTE_GIT_REPO URL behind the Clone button in Azure DevOps Services under Repos - Files).
git remote set-url origin [REMOTE_GIT_REPO]
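After switching the remote, it's worth a quick sanity check that the new URL took and that you can reach it:

# Confirm origin now points at the new Azure DevOps Services URL
git remote -v
# Verify connectivity and credentials against the new remote
git fetch origin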
Billing
Make sure your billing account is configured for the service; this is especially important when you do your Production migration. You won't be billed until the first of the next month, so make sure you have billing and users set up by the end of the month.
Build / Release Agents
Any local Build / Release agents will need to be reconfigured. I only had about 10 agents running locally, so I chose to just remove them and reinstall them after the final Production run. The PowerShell configuration command makes this very easy.
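For reference, re-pointing an agent is just a couple of commands run from the agent's installation folder. A sketch, assuming a PAT with agent pool management rights; the organization URL, pool, and agent name are all placeholders:

# Remove the agent's old TFS registration (PAT is a placeholder)
.\config.cmd remove --auth pat --token <PAT>
# Register the agent against the new organization (URL, pool, and agent name are placeholders)
.\config.cmd --unattended --url https://dev.azure.com/myorg --auth pat --token <PAT> --pool Default --agent MyBuildAgent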
I did not test this with the Dry Run; I simply reconfigured the agents after the Production migration and everything worked smoothly.
Final Import
And that is it!
We had very few other issues; the dry run went well, and the Production migration a few weeks later went very smoothly.
For the final migration, I simply repeated the steps in this guide and changed the import.json to use a Production run instead of a Dry Run.
I turned off our local TFS server; I'm keeping it around, powered off, in case we ever need the legacy code.
The main thing that came up after the final migration was getting user permissions set correctly, but I simply adjusted those settings as we went.
Some users had issues with non-Visual Studio tools being unable to connect to the remote repo, but setting their Git credentials in Azure DevOps Services under Repos - Files - Clone fixed the issue.
I hope you have learned from my efforts, and if you have any questions, let me know!