Munki, Gitlab & Git-LFS

NOTE: This is not a HOW TO, more of a rant; the how-to may or may not come later. This is written from the perspective of someone who manages (for the most part) their infrastructure end to end: VM config, service management, monitoring, and patching. When it comes to scale and complexity, your mileage may vary.

tl;dr

  • there are some caveats to this route
  • it's worth it
  • git isn't a one-way street; you can use it as little or as much as you want
  • Gitlab as a tool sets you up for more in the CI/CD realm

The idea

This post is just some general process notes / lessons learned from migrating a Munki repo to a git-tracked repository.

fwiw, "git" and "git-lfs" don't need Gitlab, its just my life, and these are my thoughts.

Rationale:

  • vcs
    • a version control system was huge; the ability to see what I changed, and when I changed it, was really rad
  • pipelines
    • though I didn't have that word for it at the time, some flow of dev/test/deploy of a repo
  • multiple users
    • though this still isn't prod, I wanted the ability for multiple users to make changes, and to have those changes tracked
  • no more smb/sftp
    • all the transport security can be done with pub/priv keys, etc

Why Gitlab?

  • I could keep it locally
    • OR use the cloud, privately too
  • The CE edition is rad (and I will do my best to link exclusively to the Community Edition docs)
    • and free
  • It has CI/CD built in
    • (though I didn't realize how great this was 'till later)

The Reality

There were some kinks that had to be worked out; the first, and maybe most obvious, was managing large files with git. Gitlab has LFS support built in, which is rad, but it still comes with its own nuances.

Git-LFS

Git Large File Storage

Git Large File Storage (LFS) replaces large files such as audio samples, videos, datasets, and graphics with text pointers inside Git, while storing the file contents on a remote server like GitHub.com or GitHub Enterprise.

Here's a handy tutorial: Getting started with Git LFS

Git LFS on Gitlab

Sounds great, and can be enabled in Gitlab easily.

LFS on Gitlab also requires you to use HTTPS for auth and transport, rather than SSH (more on what that means for your clone in a second). Digging into the Gitlab administration docs, we can see they straight up list some pretty big limitations:

  • Support for removing unreferenced LFS objects was added in 8.14 onwards.
  • LFS authentication via SSH was added with GitLab 8.12.
  • Only compatible with Git LFS client versions 1.1.0 and up, or 1.0.2.
  • The storage statistics currently count each LFS object multiple times for every project linking to it.
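One practical side effect of the HTTPS requirement: if your existing clone of the munki repo talks to Gitlab over SSH, point the remote at the HTTPS URL instead. A quick sketch (the hostname and group are placeholders, same as in the git lfs env output further down):

$ git remote set-url origin https://your.gitlab.com/group/munki_repo.git
# double-check it took
$ git remote -v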

But there's some stuff they don't tell you, that I found out the hard way:

LFS caches (this, srsly)

What I found was that when Gitlab was receiving LFS files it could cache them in a /cache location, then move them to the configured storage location.

I noticed this when the OS disk on my VM filled up. Womp. By modifying gitlab.rb you can change the LFS storage location.

Looking at gitlab.rb.template, we have these LFS options:

### Git LFS
# gitlab_rails['lfs_enabled'] = true
# gitlab_rails['lfs_storage_path'] = "/var/opt/gitlab/gitlab-rails/shared/lfs-objects"
# gitlab_rails['lfs_object_store_enabled'] = false # EE only
# gitlab_rails['lfs_object_store_direct_upload'] = false
# gitlab_rails['lfs_object_store_background_upload'] = true
# gitlab_rails['lfs_object_store_proxy_download'] = false
# gitlab_rails['lfs_object_store_remote_directory'] = "lfs-objects"
...

Oh nice! A gitlab_rails['lfs_storage_path'] option. Sweet, so you can store your LFS objects on /dev/sd(whatever), which is good to know so that, say, your OS disk doesn't fill...
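For example, if you have a bigger data disk mounted somewhere, a minimal sketch looks like this (the mount point is made up, adjust for your own layout):

# /etc/gitlab/gitlab.rb
gitlab_rails['lfs_enabled'] = true
gitlab_rails['lfs_storage_path'] = "/mnt/bigdisk/lfs-objects"

Then run sudo gitlab-ctl reconfigure to apply it.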

And what about your client? Most people have Munki running with AutoPkg or MunkiAdmin on a macOS box. So you have to have Git-LFS on your Mac, where it's a little less supported.

LFS on macOS

You can do this via the installer Git LFS provides for macOS, which is basically the command line extension and a shell script. Or you could use brew; loads of people have loads of opinions about brew, so use it or don't, and keep that to yourself.
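If you do go the brew route, it's a one-liner (assuming Homebrew itself is already installed):

$ brew install git-lfs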

Regardless, you will need to initialize it:

# Update global git config
$ git lfs install
# Update system git config
$ git lfs install --system

Which is like HEY GIT, we're going to use LFS now. But for what? And when? Touché, git.

Looking into Configuring Git Large File Storage, once you're in a tracked dir, you can:

$  git lfs track "*.psd"
Adding path *.psd

Which is cool, and it gets added to your .gitattributes file, but most admins already know what the big files in their munki repo are... so something like this in your .gitattributes file may be more applicable:

*.pkg filter=lfs diff=lfs merge=lfs -text
*.mpkg filter=lfs diff=lfs merge=lfs -text
*.dmg filter=lfs diff=lfs merge=lfs -text
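It's worth committing that .gitattributes file too, so anyone else cloning the repo picks up the same LFS rules. Something like this (the commit message is just an example):

$ git add .gitattributes
$ git commit -m "Track pkg, mpkg, and dmg files with LFS"
# sanity check - lists the patterns LFS is currently tracking
$ git lfs track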

It still CACHES

Let's take a look at our macOS git env...

$ git lfs env
Endpoint=https://your.gitlab.com/group/munki_repo.git/info/lfs
LocalGitDir=/your/munki_repo/.git
LocalGitStorageDir=/your/munki_repo/.git
LocalMediaDir=/your/munki_repo/.git/lfs/objects
LocalReferenceDir=
TempDir=/your/munki_repo/.git/lfs/tmp
...
LfsStorageDir=/your/munki_repo/.git/lfs
...

That effectively means your local repo folder can be close to double the size of the actual content, since LFS keeps its own copy of each object under .git/lfs. So plan accordingly.
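If that local double-storage starts to hurt, git lfs prune will delete local copies of LFS objects that have already been pushed to the server and aren't referenced by recent commits (it doesn't touch anything on the Gitlab side):

# see what would be removed first
$ git lfs prune --dry-run
$ git lfs prune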

Once it's tracked, it's as simple as a git push to get those files and changes to the Gitlab server.
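A hypothetical end-to-end run, with a made-up package name and assuming you're committing straight to master:

$ cp ~/Downloads/Firefox-99.0.dmg pkgs/apps/
$ git add pkgs/apps/Firefox-99.0.dmg
$ git commit -m "Add Firefox 99.0"
# LFS uploads the dmg content, git pushes the pointer
$ git push origin master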

Git

I know it sounds pretty negative up to this point, but the payoff for getting over the setup hump is git. And you can "git" as much or as little as you want.

Munki & Git

There's some great documentation on that here. But check out Munki's Repo Plugins, specifically the GitFileRepo:

GitFileRepo - a proof-of-concept/demonstration plugin. It inherits the behavior of FileRepo and does git commits for file changes in the repo (which must be already configured/initialized as a git repo)

This is rad because once your repo is tracked it does git commits for file changes in the repo. Rad.
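If I'm remembering the Repo Plugins doc correctly, you point the munki admin tools at a plugin via the munkiimport preferences. Treat the preference key below as an assumption and check the wiki page before relying on it:

# tell munkiimport (and friends) to use the GitFileRepo plugin (key name assumed)
$ defaults write com.googlecode.munki.munkiimport plugin GitFileRepo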

Git Theories

master Branch All Day

Once you have all your stuff in git, you can choose how you're going to use it. I have seen a lot of benefit from a tracked repo that simply commits changes to master. Since dev, test, and prod are all contained in "munki logic", committing any changes to master allows you to track any changes made to pkginfo or manifest files.

This works, is totally legit, and will get you a load of good info from tracked files.
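Even with nothing fancier than commits to master, the history alone is useful. For example (the path here is hypothetical), every change ever made to a single pkginfo:

$ git log --follow -p -- pkgsinfo/apps/Firefox-99.0.plist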

Let's Face(book) It, we need more

Or maybe you don't, but Facebook's CPE team has a really rad option...

Check out their CPE resources, specifically the autopkg_tools...

This is an AutoPkg wrapper script that creates a separate git feature branch and puts up a commit for each item that is imported into a git-managed Munki repo.

They have a Getting Started Guide if this sounds like more of what you're looking for.

Stuff I didn't touch on, that I could

  • catalogs, not tracking them, making them, making them with a runner
  • git-fat as an option?

In summary

I personally won't go back. There was a little tweaking to get it all sorted, but the information and tracking git provides for the munki repo is well worth it.

Coming Soon

  • I am going to try and sanitize some of the helpful CI/CD stuff I got rolling in Gitlab and talk about it here.
  • Munki in the cloud stuff
  • macOS monitoring?
  • maybe a similar rant on osquery and fleet.io

Resources


Macadmin Resources

🎥 Mac Justice - Intro to Gitlab (MacDevOPs, shorter)
🎥 Mac Justice - Intro to Gitlab (PSU Macadmins, longer)

🔗 Advanced Munki Infrastructure: Moving to Cloud Services by Rick Heil
🎥 Advanced Munki Infrastructure: Moving to Cloud Services

General Resources

🔗 Git LFS
🔗 Gitlab CE
🔗 Gitlab and LFS

🎥 Git Large File Storage - How to Work with Big Files
🎥 Git LFS Training - GitHub Universe 2015
🎥 Tracking huge files with Git LFS, GlueCon 2016