LFS Migration


Git LFS data and objects aren't migrated by the Data Center Migration tool; they must be cloned from the source and re-pushed to the target manually.

Git LFS is enabled per repository. This means that if LFS is enabled on a repository on the source instance, it will also be enabled on the target after the import.

With LFS migration, only LFS files from your main branch will be migrated. If other branches in the same repository also contain LFS files, you'll need to migrate these files manually, branch by branch, as shown in the example below.
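
For example, a minimal sketch for one additional branch, assuming a hypothetical branch name feature-x and the target instance already added as a remote named new-bitbucket (as described under Importing LFS objects/data below):

  git checkout feature-x                  # create a local branch from origin/feature-x
  git lfs fetch origin feature-x          # download the LFS objects referenced on that branch
  git lfs push new-bitbucket feature-x    # upload them to the repository on the target instance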

Before you start

The LFS migration process requires read access to all affected repositories on the source instance and write access to the same repositories on the target instance. 

Preparing the export

Export archives generated by the Data Center Migration tool don't contain LFS objects, so the process below is required for LFS objects only.

Clones can be made before, during, or after the export. Keep in mind that the export archive does not contain the LFS objects; they exist only on the source instance or in the clones you've made separately, so don't discard the source instance before you've cloned everything you need.

To prepare for migrating LFS objects, you'll need Git LFS installed on the client machine. Refer to Git Large File Storage if you're unsure how to set it up.
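
As a quick check, assuming a standard Git LFS client installation, you can verify the client is ready before cloning:

  git lfs version    # confirms the Git LFS client is installed
  git lfs install    # sets up the Git LFS configuration and hooks for your user account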

1. Clone the relevant repositories that include the LFS objects you want to migrate:

  • For each Git LFS enabled repository, create the clone on a machine that can access both the source and target instances of the migration.
  • If no such machine is available, clone from the source instance first and transfer the clone to a machine that can reach the target later.
  • You may have existing clones as part of your development workflow that you can use. If you can't or don't want to rely on existing clones, it's best to create dedicated clones for the migration process.

To migrate the full history of LFS objects, you need full clones: use git clone $source_repo_url for each LFS-enabled repository.

If recent history is sufficient, you can use a shallow clone instead: git clone $source_repo_url --depth 1 (or similar) for each LFS-enabled repository.

2. Run git lfs fetch --all on each individual clone to retrieve the LFS objects.
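
Putting steps 1 and 2 together, a minimal sketch for a single repository, using a hypothetical source URL and local directory name:

  git clone https://source.example.com/scm/proj/my-repo.git my-repo-migration
  cd my-repo-migration
  git lfs fetch --all    # downloads every LFS object referenced anywhere in the repository's history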

Exporting LFS objects/data

If needed, you can find the full export process in Export and import projects and repositories.

1. Export relevant repositories using the existing data center migration process.

2. Keep the corresponding clones containing the LFS objects separately, as described above.

Importing LFS objects/data

Separate clones are created prior to the export because only the setting that enables LFS will be restored by the import, not the LFS objects themselves.

1. Import relevant repositories using the existing data center import process. These repositories will have LFS enabled, just as they did when they were exported.

2. Once the import is complete, add the target instance as a new remote to each of your clones: git remote add $target_instance_name $target_repo_url.

3. To complete the process, push LFS objects: git lfs push --all $target_instance_name.
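
For example, a minimal sketch of steps 2 and 3 for one clone, assuming a hypothetical remote name new-bitbucket and target repository URL:

  git remote add new-bitbucket https://target.example.com/scm/proj/my-repo.git
  git lfs push --all new-bitbucket    # uploads all LFS objects from the clone to the imported repository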
