
We enforce the following limits to prevent use of Bitbucket in a way that consumes a disproportionate amount of system resources (CPU, memory, disk space, bandwidth, and so on) or that would adversely affect the performance or operation of Bitbucket for other users. Beyond the enforced limits, this page also offers some general "good neighbor" suggestions for using Bitbucket.

Repository size

  • Soft limit 1 GB: We will notify you by email and with a notification bar in Bitbucket.
  • Hard limit 2 GB: We will disable your ability to push to the repository. You will be notified by email and with a notification bar in Bitbucket.

See the following table for more information about repository size and other system limits. To find the size of your repository, see Finding your repository size.

Limit type | Size or number of occurrences | Reason

Repository size: the total size of your repository on Bitbucket.

To see the size of your repository, see Finding your repository size.

1 GB soft limit

We will notify you, with an information bar in Bitbucket, once your repository reaches the 1 GB mark.

Pay attention to this limit. Even though it is a soft limit, it is intended to make you aware of the size of your repository so you can take action before you approach the 2 GB hard limit, which has more serious consequences.

There are several actions you can take to reduce the size of your repository before you hit this limit.

If your repository is greater than 1 GB, consider whether you are using Bitbucket correctly. Keep in mind that Bitbucket is a code hosting service, not a file sharing service; we offer some suggestions for handling binaries below.

For more information about managing your repository size, see Reduce repository size.

2 GB hard limit

We will disable the ability to push to your repository. To continue working, you will have to reduce or split your repository locally, then create a new repository and push the code to it.

This is a "hard limit," which matches our download limit and helps us maintain a high level of service for all our users. Git and Mercurial repositories are inefficient at these sizes, so the performance you experience locally will be degraded, and the repository will consume more resources on our systems.

File requests: the number of times a file is actively downloaded

This is not a limit on actions such as:

  • Pushing or pulling commits
  • Running a fetch command to get recent changes
  • Pushing or pulling a branch (unless that branch contains more than 1000 new files)
5000 requests per hour

This is to prevent use of Bitbucket as a Content Delivery Network (CDN), which would consume a disproportionate amount of resources. A CDN such as Amazon CloudFront is a much better choice if you simply need to host your compiled project binaries.

Downloading archive .zip files

These archive files are accessed from the Downloads page.

Must be no larger than 2 GB

Creating and downloading archive files over 2 GB consumes huge amounts of processing power and bandwidth. For these types of requests, we recommend keeping a copy of your archive locally, or using a CDN if you need to distribute a file widely.

Finding your repository size

To check the relative size of your repository in Bitbucket, click Settings (1), which should open the Repository details (2) page, then look for the Size (3) line.

Git Repository Size from the Command Line

For Git, you can use the git count-objects -v command:
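A minimal sketch follows (the repository path is arbitrary and the sizes shown will vary; running git gc first packs loose objects so the size-pack line is populated):

```shell
# Create a small throwaway repository purely to illustrate the output.
rm -rf /tmp/size-demo
git init --quiet /tmp/size-demo
cd /tmp/size-demo
echo "hello" > README.md
git add README.md
git -c user.name=Demo -c user.email=demo@example.com commit --quiet -m "initial commit"
git gc --quiet          # pack loose objects so size-pack reflects the repository size
git count-objects -v    # the size-pack line reports the packed size in kilobytes
```

For a tiny repository like this, the size-pack value will be on the order of 1 KB.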

The size-pack value is the size of your repository when it is pushed to a remote server like Bitbucket, expressed in kilobytes. So, in the above example, the repository is not even 1 MB.

Mercurial Repository Size from the Command Line

Mercurial does not provide a command specifically for finding a repository's size. You can use the bundle command to generate a compressed copy of your repository and then check the size of the resulting file:

Have a lot of binaries such as images or sounds?

Keep in mind that Bitbucket is a code hosting service, not a file sharing service. If many of your files are extremely large, or if your files are binaries or executables, Git and Mercurial will not work well with them. You'll find that even locally your repository is barely usable. Moreover, Bitbucket can't display diffs on binaries.

For binary or executable storage, we recommend you look into file hosting services and backup tools such as Dropbox, rsync, rsnapshot, rdiff-backup, and so forth. Still not sure what to do? Review this post on Stack Overflow for more ideas.


  1. Anonymous

    This is why I LOVE bitbucket.

  2. No, there isn't. We try to keep our "rules" pretty reasonable and assume our users will be as well.  Most people know "unreasonable" when they see it. (big grin)


    1. Anonymous

      Hello, I intend to put all of my projects in Bitbucket and have even started. Will I have any problem doing this? Is there a limit to how long a project can stay there? Can I delete the projects from my hard drive?

      1. There is no limit on how long your project can be on Bitbucket.  

        You should always keep a backup of your repository on your hard drive.  Why? If you accidentally delete your repository from Bitbucket we cannot recover it for you. See this FAQ for more information: I deleted a repository or stripped commits, can you restore the data?.

    2. Anonymous


      Does Bitbucket support file sharing for files that aren't code?

      1. Use Bitbucket if you need to store text  files that change frequently and for which you need a history of each change. You can store binary files on Bitbucket but that's only really intended for assets that support source code. For example, a product logo that shows up in your application.

        File sharing is better for binary files like images, video clips, or sound.  Use a service like Dropbox.

  3. "...we recommend you look into file hosting services  such as DropBox, rsync, rsnapshot, rdiff-backup, and so forth"

    If Bitbucket worked as a file sharing service, I would never use Dropbox. ^^

  4. Anonymous

  5. Anonymous

    I mistakenly uploaded a large file in a revision. I would like to get rid of that file from the repository. Is there a way to do so?

  6. Is there really no limit? I've been trying to add statically compiled support libraries uploaded so that collaborators don't have to compile those, and I get 'connection resets by peer' when I try to push the commit.

    How do I cancel an hg add; hg commit pair so I can back up and try again with a smaller packaging?

  7. Anonymous

    I want to keep my "My Documents" and Photos under git. To be able easily synchronize it on multiple devices, delete from computer and restore later. But I want to be polite and respectful.

    Can bitbucket add special offer for users like me?

    Something like:


    • 1G, 5 users - $30/year
    • 5G, 5 users - $50/year
    • 25G, 5 users - $100/year

    And no limitation on file types?

    1. Anonymous

      Why do you want to store non-text files in a git repo anyway? You won't get any version control benefit.

      1. Anonymous

        Because they want: "To be able easily synchronize it on multiple devices, delete from computer and restore later."

        1. No, that's not what Bitbucket is intended for.

    2. Anonymous

      Why not use Dropbox? It will synchronize across multiple devices, and let you delete files from your computer and restore them later; it sounds like exactly what you want. All of your computers are kept in sync automatically, and if you need to review a previous version of any type of file, versions are stored for 30 days, or forever if you pay for extended restores.

    3. +1 for this. I need more space for a big project with large assets.

  8. Hello,

    I'm trying to do a push and it doesn't work. I'm developing a website for a singer and I'm trying to push about 20 audio files (mp3 & ogg) of about 35 MB. I don't know if that is the cause of the error. The audio files are the property of the singer and are final versions, trimmed and reduced in quality.

    I also have, in this commit, a file with the 'ç' character in its name; I don't know if that could be the cause of the error.

    Thanks, and sorry for my english!


    1. Jordi,

      You need to supply more information for us to diagnose your issue.  Please send an email to and include:

      • the exact command you are issuing
      • the output of that command



  9. Anonymous

    I'm glad that you place no limits on file uploads. However, I have a project (piler), and I'd like to host a VirtualBox image of a ready-to-use installation. Its size is ~500 MB, and I would always keep only the latest version. Is this eligible for fair use? If so, how am I supposed to upload such a file? An RTFM link is just fine.

    1. Anonymous

      It would be better to use a file hosting service like Dropbox.

  10. I recently learned that Bitbucket also supports LaTeX projects, so I now host my PhD thesis in a git repo too. Really awesome, btw! However, I wonder about the pictures... Naturally I have a couple of schematics and drawings in my thesis (a few hundred kB per picture). I wonder if it's fair to keep those in my repo as well?

    1. Use your best judgement. You have the right attitude in considering whether it is fair or not. You could put your images in one repo and then use another for your content. The subject came up on Stack Overflow:

  11. Anonymous

    BitBucket, if you were over here, I was gonna kiss you now LOL

  12. Anonymous

    Hi, I have rather large (relative to the small code part) test files for a project that I need to share with the code, like a 16 MB binary file. I see that you don't officially put a limit on repo content, but I wanted to make sure this is covered.

    Is it?

    1. You might want to consider putting that media file in a submodule (a separate repository). That way, activity in your main repo won't include that hefty file.

  13. Anonymous

    Hi, I want to use Bitbucket to store versions of my reports and R files for a class. However, I want to store all materials for the class in the same directory, and they are pretty large (adding 18 MB per week in PDF and XLS files). What would be the best option?

    1. If you don't expect these files to change frequently and/or if you aren't working on them with anyone else, you might want to consider if you really need to use version control for them.  Instead, if you want to just share them, Dropbox or some other service is a better place.  That said, there is nothing we do that prevents you from storing these files on Bitbucket.  

  14. Anonymous

    We are a team of web developers; we mostly develop simple websites. We want to use Bitbucket for version control.

    2% of our repository contains design related files eg. Photoshop files.

    Is it a good practice to use git repo for this on bitbucket?

    1. Use your best judgement. You have the right attitude in considering whether it is fair or not. You could put your images in one repo and then use another for your content. The subject came up on Stack Overflow:

  15. Anonymous

    One way of gaining trust is to make yourself vulnerable to your customers.

  16. Anonymous

    Hi, My company has a paid account with you and as such we have several repositories on it.

    Is there a way to know how much space we are using in Bitbucket without having to enter in *all* repositories and look for that information? 



    1. No, we don't currently display the sum of all repo sizes anywhere. We do have it though and we're happy to tell you if you drop us a line at

  17. I am currently using Bitbucket as a tool for developing a script-heavy Skyrim mod. The repository recently hit the 1GB threshold, because I am storing the meshes and textures in the same repo (my few betatesters are no experts in Mercurial or DVCS in general). This repository is private and only accessed by me and my betatesters (at the moment I have two active testers), the distribution of the release versions is done through a different site (nexusmods). Does this still count as fair use?

    1. Thanks for taking the time to comment!

      I would say yes, you are in-bounds, with a few caveats:

      • Git does not manage large binaries very well and this can impact your performance both locally and over the connection to Bitbucket.

      • If you are actively cloning or downloading the full repository often, this is taxing the bandwidth on the outer edges of “fair.” If you’re only pushing and pulling the code in commits and occasionally pushing the binaries then you're more in-bounds.

      • The fact you’re distributing the mod on another site shows you are “playing nicely.” Thanks for that!

      I would strongly encourage you to review our blog post How to handle big repositories. This has some great ideas for managing binaries in a Git repository.

      In addition to the suggestions there, you can consider using a separate repository or dropbox to work with textures and meshes in development then only push final versions to your main repository to reduce the impact on your repository performance.  

      Again, thank you for taking the time to comment, we appreciate your input, please let us know if this helps.

      1. Hi Dan Stevens and thanks for your insightful answer.

        The blogpost was really useful, I'll spend some time to evaluate if a repository migration from hg to git and the usage of submodules for the textures/meshes is worth the effort.

        Is there a particular reason why your reply focuses on git, while I mentioned that the repository uses hg at the moment? Would using the subrepos functionality from hg be an alternative, or is there a good reason for favoring git submodules?

        p.s. The individual binaries aren't that large, a few MB at most per file, and they don't change often. So far we have not encountered any performance issues. The repository is only cloned once every few months, usually when a new betatester joins the team. The main work is just pushes/pulls of the latest commits.

        1. Ugh. I truly apologize for focusing on git; that was completely an oversight on my part. We do have some good suggestions in the topic Maintaining a Mercurial Repository. In addition, there are some good articles which will pop up in a simple "large Mercurial repositories" search.

          The more you describe it, the more it sounds like you're using your repo very fairly. The one thing I would say is: keep an eye on your repo size, run some of the cleanup suggestions from time to time, and look for a spot where you might split it into two projects if that makes sense; you get unlimited repositories, so.... (smile)

          Again, sorry about neglecting to give you good Mercurial options or comments. Sometimes I read and reply too quickly.

          1. Thanks for your support and the additional material. After the next release in a few weeks there will be a good opportunity to split off the assets into a subrepo and do some cleaning as you suggested. (smile)

      2. Given the way that Git handles repository history by always pointing to and reusing (not duplicating) any unchanged content, would it be correct to say that...

        • Even very large files won't add more repository size in future commits so long as they do not change, and
        • Converting an existing repository to using submodules for subsequent commits does nothing to reduce the size already attained (though it might avoid increases in size if the large files in a submodule are changed in the future)?

        For that matter, would the same be true for Mercurial and converting to using subrepos?


  18. Is there a way to tell bitbucket to do a git gc on one of my repostitories (or better: set the variable to a small value)?

    The git repository takes 1.9 GB of space on the Bitbucket server. However, if I clone it, it's less than 80 MB, and that can be shrunk even further by doing a git gc --aggressive.

    1. For the time being, please raise a support request to let us know to gc your repo. We're working on tweaking settings or providing a way to force a prune/gc in the future.

  19. Does Bitbucket provide an easy way for a user to review the sizes of their repositories?  If there are going to be any limits to size, there really should be some way for the user to see how big their repositories are at Bitbucket.  (Or if there already is a way, it would help to make it more visible and apparent.)  Thanks much.

    (p.s. I second the idea of providing a way to garbage collect / compress / clean up the copy of the repository on Bitbucket.)


    1. You can always find the currently calculated repository size on the Repository settings overview page. That is calculated from ALL data in the .git directory on our servers. As I said above, we're working on automating a process to force repos down to size once they've been cleaned.

      1. Actually, it seems that we removed the size from the overview page. It can now be found in the admin overview section:  

    2. I'm updating the doc with this information.

  20. Ahh, that's where the size is hidden. I would have expected the current total repository size to be part of the repository's Overview information (e.g. along with Last updated, Language, or Access level) or else perhaps on the main Source page (since individual file sizes are reported there), and wouldn't have thought of it as one of the "Settings" (since one cannot "set" the size, as one can set any of the other settings).


    Thanks to all for the information about finding the repository size, and for the blazingly fast upgrade to this page to make it clear for all!  I am consistently impressed with the rapid response by you folks to make clarifications, improved documentation, responses to questions, etc.


    1. Interestingly, it previously was on the repository overview. We removed it during the fluid width redesign as we didn't think it was  useful. Then we added limits. We may bring it back again, in some way, to the overview soon. Thanks for your comments!


  21. The most likely reason I have encountered for repositories to grow large is the inclusion of binary resources (for example: sample data files used in test harnesses; library assemblies that the project code depends on; graphic-rich content for use in documentation). Do you have any recommendations for how best to manage those assets, since in many cases it is important that the correct versions of those assets are available?

    1. In general, unless you have built those libraries yourself and/or you maintain the source of them, you shouldn't bother copying them at all into source control. Instead, a simple file host, or their original source, will do well. Then, when building your project, pull them in as dependencies.

      If you do build them as well, then they should be in separate repositories, one per library that you maintain, and again, build them on their own and pull in the dependencies as you need them.

      Depending on your language and process, there are numerous tools available for managing external dependencies for your project. Some that are commonly used are Maven for Java, NuGet for .Net, Bundler for Ruby and Pip for Python.


      1. That's certainly a fair recommendation, but it arguably introduces an additional risk (and a risk that may not be considered acceptable by management). While it is not likely that services like the ones you mention will fail it is a possibility. In those circumstances it may be difficult (perhaps even impossible for in-house generated binary content such as the test files) to locate the correct version(s) of those resources - a problem which doesn't exist if those resources are available within some sort of version controlled repository.

        1. For the tools I mentioned, they are client applications. This means they are able to pull dependencies from a local drive, remote URL or the built-in sources. You can take your third party libraries and place them in secondary repositories on Bitbucket even. Then they would be hosted in the same place your main codebase is, but they wouldn't clutter your actual application code. This is how we manage Bitbucket actually. We have a few libraries we maintain local forks of. We use a direct URL to our own internal systems to pull in those libraries when building up a new server, upgrading or setting up our development environments. We use Git's tags/branches to ensure we get the version we intend when pulling from those sources.

          Finally, if you require binary assets (like in game development) to be in the same repository, I'd recommend another VCS like Perforce. It is designed differently and made to handle larger files. I'd still recommend against placing third party code in the same repository.

      2. "Depending on your language and process, there are numerous tools available for managing external dependencies for your project. Some that are commonly used are..."

        and ___???___ for C++?

        1. I found this question and answer on StackOverflow: Unfortunately, we cannot provide an exhaustive list for every language, but there are certainly tools for just about everything out there.

  22. OK, so my private repo was flagged at 2.2 GB. It started life as an SVN repo, was converted to a git repo, and brought along too much cruft.

    I ran the lovely BFG repo cleaner and now the bare repo is 730 MB. Yay!

    I pushed the bare repo back to bitbucket via a "git push" command (that's it, no other arguments). This seemed to work (git reported that this was a forced update, pushed back to

    But now Bitbucket is reporting that my repo is 2.9 GB! It appears that it has added my newly-compressed repo to the existing bloated one. I've also waited a day, in case Bitbucket measures the repo size once a day in some scheduled batch job. Nope, it still says I have a 2.9 GB repo.

    I have also tried "git push --mirror" and git immediately responds with "Everything up-to-date" and takes no action.

    How can I push my new bare repo to Bitbucket so that it completely replaces the bloaty one, and makes the nag message go away?

    1. Hey Doug,

      Thanks for taking the time to comment. At the moment, what I would have you do is contact support and request a git gc on your repository to clean it up. We are working on a way for you to do this on your own. For the moment, though, just ask and we'll run it within a reasonable time and get you back in shape.

      Great work getting your repo trimmed on your end!!



  23. Just wanted to let you guys know I'm in the exact same situation as Doug. I'll contact support as Dan suggested.

    Waiting for a solution so that we can do this ourselves. We also have 3D meshes in the repository. They don't change very often, so I figured we can just erase their history from time to time, but pushing the bare repo didn't replace the existing one.



  24. After reading all this, I'd like to get rid of files in the repo - how? I deleted many files locally and .gitignore'd many more. But that does not seem to have any effect on the repo size on bitbucket. What now?

    1. Good day Ralf,

      If you haven't already done so, be sure you've actually removed the files from your local history as well as having deleted them from your directories. If you've not done this, or are unfamiliar with the procedure, you will want to look at Maintaining a Git Repository before you do. That said, here's the procedure; it works for any file, not just "large" files.

       Remove large files by name

      Use the following procedure to remove large files:

      1. Run the following command to remove the first large file you identified:

        git filter-branch --index-filter 'git rm --cached --ignore-unmatch filename' HEAD
      2. Repeat Step 1 for each remaining large file.
      3. Update the references in your repository.
        This command iterates over and updates the original references, and deletes any references to the large files you removed.

        $ git for-each-ref --format="%(refname)" refs/original/ | xargs -n 1 git update-ref -d
      4. Prune all of the reflog references from now on back.

        $ git reflog expire --expire=now --all
      5. Repack the repository by running the garbage collector and pruning old objects.

        $ git gc --prune=now
      6. Push all your changes back to the Bitbucket repository.

        $ git push --all --force
      7. Make sure all your tags are current too:

        $ git push --tags --force

      Once you've done that, you can send an e-mail to and ask them to run an additional git gc on your Bitbucket repository. That should address the problem. If not please let us know or open a support ticket.

      I hope this helps.

      Happy coding,


      1. Good day to you, Dan,

        Thanks for your extensive reply. Meanwhile I had a look at Maintaining a Git Repository and also at (So far I had only read the basics in

        This all looks like voodoo to me. And I'd like to remove about a hundred files (videos and images), not just a few large ones, so this single-file removal does not look feasible. BFG could work, but I'm not using Linux. Would you know of a Windows tool to do this?

        Otherwise I could just delete the whole repo and recreate it with my .gitignore that I now have in place. Still, I'm surprised it's so hard to delete some files ...

        1. So I hear you loud and clear. One of the awesome things about Git is its redundancy. One of the most painful things about Git is its redundancy. (smile) (sad) all in one.

          So before you do anything rash like deleting and starting over consider the following:

          • How big is your repo? If you're still well under the 1 GB soft limit, you might consider just using a visual client like SourceTree to remove the files from your working copy. While this won't entirely shrink your repo size on disk, it will get them out of the way, so to speak. And since you've updated your .gitignore, you should be able to move forward pretty confidently.

          Otherwise I'm afraid you might be stuck either deleting files in batches using the method I posted or deleting the repo then adding it back. If you do that you might consider the following:

          1. Pull your repo and be sure you have all your files.
          2. Create a back up copy, you know, just in case.
          3. Copy the files you want to keep in your new repository to a new directory.
          4. Choose one of the following:
            1. If you want to keep the same repo name and URL:
              1. Delete the repository from the Bitbucket
              2. Wait 24 hours.
              3. Recreate the repository in Bitbucket

              4. Navigate to the directory you created for the repository:

              5. run git init
              6. Copy the command from I have an existing repository on the Repo setup page of your new repository. Something like:

              7. Then push everything up. (smile)
            2. If you don't care about the repository name / url you can just:
              1. Create a new repo in Bitbucket
              2. Clone to your local system.
              3. Copy the directory and files you want to keep into the cloned directory.
              4. Push everything back to Bitbucket.
              5. Keep the old repo as a history and back up, it doesn't cost you anything. Or delete it, if you'd just rather forget it. (smile)

          I'm sorry I don't have a better answer for you. I did try removing files from SourceTree, since it's visual, but then you have to actually manage the rest of the deletion process anyway (sad).

          I hope this helps, if I find a better solution I'll add it to this discussion.

          Happy coding,


  25. We have a large repo, it includes the following message when we navigate to it in a browser:

    This repository's size is over 1 GB. Read more on how to reduce the size of your repository.

    Notice the link leads to this page. I don't see any info here on reducing the size of our repo. Am I missing something?

    Other than having quite a lot of history, and a relatively large project to begin with, we had a developer commit a huge bundle of sample data. We got it removed but couldn't rebase it out because of the number of folks that would have had to coordinate. Any advice for us would be greatly appreciated!


       we had a developer commit a huge bundle of sample data. We got it removed but couldn't rebase it out

      Yes, as long as you don't rebase it out, it's not gone and so your repo will not reduce in size. The only way to remedy this is to really remove it from the history. This will indeed require some coordination among your developers to prevent the file from making its way back in later on.

      Some help on doing this can be found here:


  26. I am getting warned that my repo size is over 1GB. However, when I run

    the size-pack value is at 600 MB. Why would there be such a large difference between how big Bitbucket says the repo is and what Git says?

    1. That has to do with the way Git stores data. With every commit you create objects; such objects contain the complete file that was edited. Once in a while (depending on the number of objects, which you can fine-tune with git config), Git performs a git gc, which makes an efficient pack out of all the single objects.

      So your reported 600 MB is the size of your local pack, whereas on the Bitbucket server the git gc has not occurred yet. So on the Bitbucket server there are still many loose objects, which are far less efficient than your pack. Unfortunately, you cannot do a git gc on the server; you have to ask someone from Atlassian to do that for you.

      It would be nice, though, to have an option on Bitbucket to perform a git gc yourself. It would not be too hard to implement.

    2. Update: I did contact Atlassian Support and they ran 


      on the server side. This cut our repo size in half!