

Taking Infrastructure-as-Code further

BitOps is a portable Docker image that understands a defined structure, called an 'Operations Repo', and ensures your application's declarative infrastructure is organized optimally and executed according to your schema.

Jump in with the Introducing BitOps guide

The Operations Repository

BitOps' Operations Repository, or 'Ops Repo' for short, is where all of the magic happens. By organizing each piece of your application and infrastructure configuration in a declarative manner, BitOps can reliably and easily deploy your code to a multitude of environments using your favorite IaC tools.

The Ops Repo structure makes blue-green deployments easy: code from a single repo can be shared across many environments and replicated to individual cloud instances.
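As a sketch, an Ops Repo might be laid out like this, with one directory per environment and one subdirectory per IaC tool (the environment and file names here are illustrative; check the BitOps docs for the exact schema):

```
ops-repo/
├── test/                       # one directory per environment
│   ├── terraform/
│   │   ├── bitops.config.yaml  # tells BitOps how to run this tool
│   │   └── main.tf
│   └── ansible/
│       ├── bitops.config.yaml
│       └── playbook.yaml
└── prod/
    ├── terraform/
    │   ├── bitops.config.yaml
    │   └── main.tf
    └── ansible/
        ├── bitops.config.yaml
        └── playbook.yaml
```

Because `test/` and `prod/` share the same shape, promoting a change between environments is largely a matter of copying configuration rather than rebuilding a pipeline.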

Pluggable Lifecycle

We know not all code conforms to any one specific standard, which is why we built BitOps with its lifecycle in mind.

Each BitOps plugin is configured to execute before and after deploy scripts, making it simple to adapt any existing environments to an Ops Repo.
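As an example of the lifecycle hooks, a tool directory can carry before- and after-deploy scripts alongside its configuration. The directory names below reflect our understanding of the BitOps convention (`bitops.before-deploy.d/` and `bitops.after-deploy.d/`); verify them against the official docs, and treat the script names as hypothetical:

```
test/terraform/
├── bitops.config.yaml
├── main.tf
├── bitops.before-deploy.d/
│   └── fetch-secrets.sh    # runs before the terraform deploy
└── bitops.after-deploy.d/
    └── notify-team.sh      # runs after the deploy completes
```

This is what makes adapting an existing environment straightforward: any setup or teardown steps your current pipeline performs can be dropped into the hook directories as-is.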

Cloud à la Carte

BitOps integrates with cloud service providers, making it effortless to quickly provision and tear down complete instances of your app while working in tandem with any servers of your own.

An Operations Repo is great for scaling as well. Whether that means more powerful servers or a greater number of them, BitOps is flexible enough to meet your changing needs.


Simply Secure

The Docker container that executes BitOps destroys itself upon completion, meaning that any secrets or sensitive information exported to its environment are erased along with the container.

Combined with the before and after lifecycle scripts, this allows BitOps to configure your environment however you need it, regardless of where your parameters are stored.
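For instance, cloud credentials can be passed to the container as environment variables at run time; with `--rm`, Docker removes the container, and those values with it, as soon as the run finishes. The variable names below assume an AWS deployment, and the `ENVIRONMENT` variable selecting which environment directory to deploy reflects our reading of the docs:

```
docker run --rm \
  -e ENVIRONMENT="test" \
  -e AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY \
  -v $(pwd):/opt/bitops_deployment \
  bitovi/bitops
```

Passing `-e AWS_ACCESS_KEY_ID` with no value forwards the variable from your shell, so the secret never appears in your shell history or in the Ops Repo itself.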


Why use BitOps?

BitOps simplifies deployment pipelines by providing pre-installed deployment tools and YAML-based configuration schemas for both your infrastructure and the artifacts deployed onto it.

BitOps takes complex, multi-step configurations and transforms them into a tool-agnostic, repeatable, and streamlined method for deployment and scaling.

For more information about what BitOps is and what it can do for your applications, check out our tutorial series starting with Introducing BitOps!

Supported Tools

ansible · aws · cloudformation · docker · helm · terraform

Don't see your favorite tool? Let us know!

Prior to partnering with Bitovi, our DevOps involved a lot of manual intervention which was time consuming and error-prone. That's all changed with Bitovi's help and their DevOps management tool, "BitOps." Our operations have been streamlined using infrastructure-as-code which makes upgrades and maintenance easy.

– Jason C, Business Alliance Financial Services

Running BitOps

BitOps is available on GitHub, as well as a Docker image via Docker Hub.

To run BitOps, you need an Operations Repository that defines your infrastructure. Check out the Yeoman generator to create a basic 'ops repo' structure instantly.

Executing BitOps is as simple as:
docker run -v $(pwd):/opt/bitops_deployment bitovi/bitops
For more information on more ways to use BitOps, check out the official docs, or one of our quick start guides.


Explore BitOps Resources

Looking to get started with BitOps? Check out some of our resources below!

Why We Made BitOps

Bitovi created BitOps in response to a frequent client need: a portable, flexible method of describing existing, complex application infrastructure across many environments in a rigorous manner. We quickly saw the benefits, as BitOps allows Bitovi to approach a broad variety of environments in a modular way and more efficiently meet expanding client needs for cloud infrastructure migrations.