macOS Compatibility Fun!

Compatibility Questions

If you work with Macs and Jamf then you know every year there’s a new per-OS Extension Attribute (EA) or Smart Group (SG) recipe to determine if macOS will run on your fleet’s hardware. However, I asked myself: what if a single Extension Attribute script could fill that need, requiring only a periodic update of Model IDs and the addition of new macOS versions?

Then I also asked: Could this same script be re-purposed to output both text and CSV, not just for the script’s running host but for a list of Model IDs? And the answer was a resounding yes on all fronts!

EA Answers

So, my fellow Jamf admins, I present to you macOSCompatibility.sh. In its simplest form you just run the script and it provides ultra-sparse EA output like <result>10.14 10.15 11</result>, which can then be used as Smart Group criteria. Something like “macOS Catalina Compatible” would match all Macs using LIKE 10.15, while “Big Sur Incompatible” would use NOT LIKE 11. Of course, care must be taken if you are also testing for 10.11 compatibility (since “11” is also a substring of “10.11”), but the versionsToCheck variable in the script can limit the default range to something useful, and the fewer versions there are, the faster it runs. I hope this helps Jamf admins who have vast, unwieldy fleets where hardware can vary wildly across regions or departments.
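
For a quick local sanity check that mirrors the Smart Group logic, you can grep the EA output for a version string; a minimal sketch, assuming the script is in your current directory:

# mirrors a "LIKE 10.15" Smart Group check against the EA output (script path is an assumption)
if ./macOSCompatibility.sh | grep -q '10\.15'; then
    echo "Catalina compatible"
else
    echo "Catalina incompatible"
fi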

CSV Answers

Now if you provide a couple of arguments, like so: ./macOSCompatibility.sh -c -v ALL ALL > ~/Desktop/macOSCompatibilityMatrix.csv, you will get a pretty spiffy CSV that lets you visualize which Mac models over the years have enjoyed the most (and least) macOS compatibility. This is my favorite mode; you can use it to assess the OS coverage of past Macs.

See macOSCompatibilityMatrix.csv for an example of the output. If you bring that CSV into Numbers or Excel you can surely liven it up with some Conditional Formatting! This is the barest of examples:

Can you spot the worst and best values?

Text Answers

If you don’t use the -c flag then it’ll just output plain old text, like so: ./macOSCompatibility.sh -v ALL ALL

iMacPro1,1: 10.13 10.14 10.15 11
MacBook1,1: 10.4 10.5 10.6
MacBook2,1: 10.4 10.5 10.6 10.7
MacBook3,1: 10.5 10.6 10.7
MacBook4,1: 10.5 10.6 10.7
MacBook5,1: 10.5 10.6 10.7 10.8 10.9 10.10 10.11
MacBook6,1: 10.6 10.7 10.8 10.9 10.10 10.11 10.12 10.13
MacBook7,1: 10.6 10.7 10.8 10.9 10.10 10.11 10.12 10.13
MacBook8,1: 10.10 10.11 10.12 10.13 10.14 10.15 11
MacBook9,1: 10.11 10.12 10.13 10.14 10.15 11
MacBook10,1: 10.12 10.13 10.14 10.15 11
MacBookAir1,1: 10.5 10.6 10.7
MacBookAir2,1: 10.5 10.6 10.7 10.8 10.9 10.10 10.11

Wrapping Up

Now, it’s not totally perfect since some models shared Model IDs (the 2012 Retina and non-Retina MacBook Pros, for example), but for the most part the Intel Mac Model IDs were sane compared to the PPC hardware Model IDs: abrupt jumps, overlaps, and re-use across model families. Blech! I’m glad Apple “got religion” for Model IDs (for the most part) when Intel CPUs came along. I did attempt to go back to 10.1-10.3 with PPC hardware but it was such a mess it wasn’t worth it. However, testing Intel, Apple Silicon, and VMs against macOS 10.4 – 11+ seems to have some real use, and perhaps you think so too? Thanks for reading!

jpt + jamf uapi = backupJamf-scripts

Jamf UAPI: JSON Only

I am a creature of habit, no doubt, however sometimes you must get out of your comfort zone. The Jamf Universal API (UAPI) is one such case: it is JSON only, not XML. Those tried and true xpath snippets will no longer work with JSON, so what tool do you use? macOS really doesn’t have a good built-in JSON tool, and if your scripts are client-side do you really want jq as a dependency? Good thing I wrote a JSON parser this summer that you can embed in your scripts! In fact, when I finished writing my JSON power tool jpt, I needed some practical examples to demonstrate its utility. Looking at the UAPI it’s clear some parts are still a work in progress, however the endpoint for scripts is actually really good: it gives you everything in one go, almost like a database query. That should make backing up scripts a breeze!

backupJamf-scripts. boom.

If you’ve used the Classic API from Jamf, it is a 1:1 ratio of scripts to API calls: 2 scripts? 2 API calls via curl. 200 scripts? 200 API calls via curl. The new Universal API reduces that down to one call to get everything (plus one call to get the token); it’s super fast and I love it. Check out backupJamf-scripts.command in my newly minted jamfTools repo on GitHub for a working demonstration of both the Jamf UAPI and jpt’s in-script JSON handling. I hope you like it!
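
The basic flow is something like this rough sketch (not the actual backupJamf-scripts.command; the server and account are placeholders, and a quick sed stands in for jpt’s JSON parsing just to keep it short):

#!/bin/bash
# rough sketch of the two-call UAPI flow; server and account names are made up
jssURL="https://your.jamf.server:8443"
apiUser="scriptReader"
read -rsp "Password for ${apiUser}: " apiPass; echo

# call 1: trade basic auth for a bearer token (sed pulls the "token" value; the real script uses jpt)
token=$(curl -su "${apiUser}:${apiPass}" -X POST "${jssURL}/api/v1/auth/token" | sed -n 's/.*"token"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p')

# call 2: every script, names and contents included, in a single response
curl -s -H "Authorization: Bearer ${token}" "${jssURL}/api/v1/scripts?page-size=1000" > ./allScripts.json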

jpt + jamf uapi = scripts downloading scripts

Jamf & FileVault 2: Tips & Tricks (and more)

Raiders of the Lost Feature Requests

So there’s this old feature request at Jamf Nation (stop me if you’ve heard this one…), almost 6 years old now: Add ability to report on FV2 Recovery Keys (and/or access them via API). In fact, maybe you came here from there; watch out, don’t loop! Continue!

The pain point is this: keys are sent back to Jamf Pro (JSS) but can then only be retrieved manually and interactively through the web interface, not via the API or any other method. For a mass migration to another JSS it sure would be nice to move those keys over rather than decrypt/re-encrypt. Well, I’ve got a few insights on this that I’d like to share and that may help. ‘Cuz hey, it’s 2020 and we’ve learned that hoarding is just silly.

Firstly, it should be pointed out that neither ye olde “Recovery Key Redirection” payload nor its replacement “Recovery Key Escrow” is needed to get keys to the JSS. There is another method, and it’s what the built-in “Filevault Encryption” policy payload uses to get keys back to your JSS. Jamf references this method in this old script at their GitHub. I revamped the core bits a couple years ago in a (nearly 7 year old) feature request: Manually Edit FileVault 2 Recovery Key

Telling the JSS Your Secrets

The takeaway is that we have a way to explicitly send keys to the JSS by placing 2 XML files in the /Library/Application Support/JAMF/run folder: file_vault_2_id.xml and file_vault_2_recovery_key.xml. Also note that Jamf has changed the process for the better in the last two years: a jamf recon (or two) is no longer required to send the key and validate it; instead JamfDaemon sends it immediately when both files are detected. Which is nice, but it’s the subsequent recon validations where we have an opportunity to get grabby.

Cold Lamping, Hard Linking

So here’s the fun part: when recon occurs there’s lots of file traffic in /Library/Application Support/JAMF/tmp; all sorts of transient scripts hit this folder. What we can do is make hard links to these files as they come in, so that when the link in tmp is removed another still exists elsewhere and the file survives (just in our new location). EAGrabber.sh does exactly that (and a little bit more).
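
The core of the trick is just a tight loop and ln; here is a stripped-down sketch of the idea (not EAGrabber.sh itself, and the destination folder is made up):

#!/bin/bash
# stripped-down sketch of the hard-link trick; EAGrabber.sh does this and more
watchDir="/Library/Application Support/JAMF/tmp"
saveDir="/private/var/root/grabbed"   # hypothetical destination
mkdir -p "${saveDir}"

# start this just before a recon; a hard link survives deletion of the original in tmp
while :; do
    for f in "${watchDir}"/*; do
        [ -f "${f}" ] || continue
        ln "${f}" "${saveDir}/$(basename "${f}")" 2>/dev/null
    done
    sleep 0.1
done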

EAGrabber.sh can easily be modified to narrow its focus to the FileVault 2 key only, deleting the rest. What you do with the key is up to you: send it somewhere else for safekeeping, or keep it on the device temporarily for a migration to another Jamf console. A policy script from the new JSS could then write that key to disk as file_vault_2_recovery_key.xml, which will then import and validate; no decrypt/re-encrypt necessary. Hope this helps.

Cuidado ¡ Achtung ! Alert

Jamf admins take note: do you have hard-coded passwords in your extension attributes or scripts? If so, then all your scripts are belong to us. Now, go read Obfuscation vs. Encryption from Richard Purves. Read it? OK, now consider what happens if you were to add a routine that captures the output of ps aww alongside a hard-linking loop like the one in EAGrabber.sh. If you are passing API credentials to scripts via policy parameters, then ps can capture them, and even if you try to obscure them, once we’ve captured the script we can de-obfuscate them. This is a good reason to be really careful about what your API accounts can do. If you have an API account with Computer record Read rights that gets passed into a script via policy and you use LAPS, then captured API credentials could be used to harvest LAPS passwords via the API. Keep this in mind and we’ll see if any meaningful changes come to recon and/or the script running process in the future (if you open a ticket you can reference PI-006270 regarding API credentials in the process list). In the meantime, make API actions as short-lived as possible and cross your fingers that only you, good and noble #MacAdmins, read this blog. 🤞
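
To make the exposure concrete, here is a tiny hypothetical watcher; anything passed as a script parameter is visible in the process list for the life of the script:

#!/bin/bash
# hypothetical illustration only: poll ps while policies run and policy script
# invocations (parameters, credentials and all) show up in the arguments
logFile="/private/var/tmp/ps_capture.log"   # made-up capture location
while :; do
    ps aww | grep '[J]AMF/tmp' >> "${logFile}"
    sleep 0.05
done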

jpt: jamf examples pt. 2

Over in the MacAdmins’ #bash channel I saw a question about how to get the Sharing states of Bluetooth devices from system_profiler. The most succinct answer was to awk out the values:

system_profiler SPBluetoothDataType 2> /dev/null | awk '/State: / {print $2}'
Disabled
Disabled
Disabled
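
Wrapped as a Jamf Extension Attribute, that one-liner might look something like this quick sketch:

#!/bin/bash
# hypothetical EA wrapper around the awk one-liner above
states=$(system_profiler SPBluetoothDataType 2> /dev/null | awk '/State: / {print $2}')
echo "<result>${states}</result>"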

If you are using this for a Jamf Extension Attribute, I suppose it’ll do if you never want any of them to be Enabled, but what if Internet Sharing was OK but not File Sharing? How would you match your Smart Group to multiple lines of unlabeled values? How would you match the first two but not the last two? And what if there was another USB Bluetooth device? That would add extra rows. Hmmm…

The answer, for me, is outputting the service name and the state on the same line. Since there isn’t a consistent line count from State: back up to the service name, using something like grep -B n to include n lines of preceding data isn’t going to work.

      Services:
          Bluetooth File Transfer:
              Folder other devices can browse: ~/Public
              When receiving items: Accept all without warning
              State: Disabled
          Bluetooth File Exchange:
              Folder for accepted items: ~/Downloads
              When other items are accepted: Save to location
              When receiving items: Accept all without warning
              State: Disabled
          Bluetooth Internet Sharing:
              State: Disabled

So you know what I say the answer to that is? That’s right: jpt, the JSON Power Tool! It can parse the -json output from system_profiler in a more structured way, and it allows for the discovery of however many applicable Bluetooth devices might be on the system.

Here’s a sample run with Internet Sharing turned On as well as Bluetooth Sharing turned On:

file_browsing: disabled
object_push: enabled
internet_sharing: enabled

File Browsing is set to “Never Allow” but File Receiving is in the affirmative (Accept and Open, Accept and Save, or Ask). The addition of labels gives us the ability to create a Smart Group to match specific services like “file_browsing: enabled” or any other combination thereof (perhaps internet_sharing should always be enabled, who am I to say what your requirements are!).

About the jpt

The JSON Power Tool (jpt) is a parser/manipulator for JSON documents written in JavaScript and shell. It can run standalone or embedded in your bash or zsh scripts, all the way back to OS X 10.4 Tiger! Check it out at: https://github.com/brunerd/jpt

jpt: jamf examples

jpt has some practical applications for the Jamf admin

Sanitizing Jamf Reports

Let’s say you’ve exported an Advanced Search; it’s got some interesting data you’d like to share, however there is personal data in it. Rather than re-running the report, why not blank out or remove those fields?

Here’s a sample file: advancedcomputersearch-2-raw.json

This is an excerpt from one of the computer records:

{
"name": "Deathquark",
"udid": "2ca8977b-05a1-4cf0-9e06-24c4aa8115bc",
"Managed": "Managed",
"id": 2,
"Computer_Name": "Deathquark",
"Last_Inventory_Update": "2020-11-04 22:01:09",
"Total_Number_of_Cores": "8",
"Username": "Professor Frink",
"FileVault_2_Status": "Encrypted",
"JSS_Computer_ID": "2",
"Number_of_Available_Updates": "1",
"Model_Identifier": "MacBookPro16,1",
"Operating_System": "Mac OS X 10.15.7",
"Model": "MacBook Pro (16-inch, 2019)",
"MAC_Address": "12:34:56:78:9A:BC",
"Serial_Number": "C02K2ND8CMF1",
"Email_Address": "frink@hoyvin-glavin.com",
"IP_Address": "10.0.1.42",
"FileVault_Status": "1/1",
"Processor_Type": "Intel Core i9",
"Processor_Speed_MHz": "2457",
"Total_RAM_MB": "65536"
}

Now let’s say the privacy standards at this place are GDPR on steroids and all personally identifiable information must be removed: serials, IPs, UUIDs, almost everything (but you don’t want to run the report again because it took ages to get the output)!

Here’s what that command would look like: jpt -o replace -v '"REDACTED"' -p '$["advanced_computer_search"]["computers"][*]["name","udid","Username","Computer_Name","MAC_Address","Serial_Number","Email_Address","IP_Address"]' ./advancedcomputersearch-2-raw.json

Here’s what that same computer looks like in the resulting output (as well as all others in the document):

  {
    "name": "REDACTED",
    "udid": "REDACTED",
    "Managed": "Managed",
    "id": 2,
    "Computer_Name": "REDACTED",
    "Architecture_Type": "i386",
    "Make": "Apple",
    "Service_Pack": "",
    "Last_Inventory_Update": "2020-11-04 22:01:09",
    "Active_Directory_Status": "Not Bound",
    "Total_Number_of_Cores": "8",
    "Username": "REDACTED",
    "FileVault_2_Status": "Encrypted",
    "JSS_Computer_ID": "254",
    "Number_of_Available_Updates": "1",
    "Model_Identifier": "MacBookPro16,1",
    "Operating_System": "Mac OS X 10.15.7",
    "Model": "MacBook Pro (16-inch, 2019)",
    "MAC_Address": "REDACTED",
    "Serial_Number": "REDACTED",
    "Email_Address": "REDACTED",
    "IP_Address": "REDACTED",
    "FileVault_Status": "1/1",
    "Processor_Type": "Intel Core i9",
    "Processor_Speed_MHz": "2457",
    "Total_RAM_MB": "65536"
  }

What we did was apply the -o replace operation, with a -v <value> of the JSON string "REDACTED", to all the paths matched by the JSONPath union expression (the comma-separated property names in brackets) given to the -p option. JSON Pointer can only act on one value at a time; this is where JSONPath can save you time and really shines.
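
For contrast, a JSON Pointer path can only address one value at a time, something like this hypothetical single-field equivalent:

# hypothetical: redacts only the first computer's "name"; you'd need one call per field, per computer
jpt -o replace -v '"REDACTED"' -p '/advanced_computer_search/computers/0/name' ./advancedcomputersearch-2-raw.json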

jpt is fast because WebKit’s JavaScriptCore engine is fast. For instance, a larger version of that search has 15,999 computers, and it took only 11 seconds to redact every record:

jpt -l $.advanced_computer_search.computers ./advancedcomputersearch.json
15999

time jpt -o replace -v '"REDACTED"' -p '$.advanced_computer_search.computers[*]["name","udid","Username","Computer_Name","MAC_Address","Serial_Number","Email_Address","IP_Address"]' ./advancedcomputersearch.json > /dev/null 

11.14s user 0.35s system 104% cpu 11.030 total

Put Smart Computer Groups on a diet

When you download Smart Groups via the API, you also get an array of all the computer objects that match at that moment in time. If you just want to back up the logic or upload it to another system, you don’t want all those computers in there.
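
A file like the sample below could be pulled from the Classic API with something along these lines (the server URL, account variables, and group ID 12 are just placeholders matching the sample):

# hypothetical fetch: ask the Classic API for JSON instead of XML
curl -su "${apiUser}:${apiPass}" -H "Accept: application/json" \
  "${jssURL}/JSSResource/computergroups/id/12" > ./smartgroup-1-raw.json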

Sample file: smartgroup-1-raw.json

{
  "computer_group": {
    "id": 12,
    "name": "All 10.15.x Macs",
    "is_smart": true,
    "site": {
      "id": -1,
      "name": "None"
    },
    "criteria": [
      {
        "name": "Operating System Version",
        "priority": 0,
        "and_or": "and",
        "search_type": "like",
        "value": "10.15",
        "opening_paren": false,
        "closing_paren": false
      }
    ],
    "computers": [
      {
        "id": 1,
        "name": "mac1",
        "mac_address": "12:34:56:78:9A:BC",
        "alt_mac_address": "12:34:56:78:9A:BD",
        "serial_number": "Z18D132XJTQD"
      },
      {
        "id": 2,
        "name": "mac2",
        "mac_address": "12:34:56:78:9A:BE",
        "alt_mac_address": "12:34:56:78:9A:BF",
        "serial_number": "Z39VM86X01MZ"
      }
    ]
  }
}

Let’s remove those computers with jpt and the JSON Patch remove operation:
jpt -o remove -p '$.computer_group.computers' ./smartgroup-1-raw.json

Since the target is a single property name, JSON Pointer can also be used:
jpt -o remove -p /computer_group/computers ./smartgroup-1-raw.json

{
  "computer_group": {
    "id": 12,
    "name": "All 10.15.x Macs",
    "is_smart": true,
    "site": {
      "id": -1,
      "name": "None"
    },
    "criteria": [
      {
        "name": "Operating System Version",
        "priority": 0,
        "and_or": "and",
        "search_type": "like",
        "value": "10.15",
        "opening_paren": false,
        "closing_paren": false
      }
    ]
  }
}

Further Uses

Using only JSON Patch replace and remove operations, this JSON could have its id removed to prep it for an API POST on another JSS to create a new group, or the name and value could be modified in a looping script to create multiple JSON files, one for every macOS version; see the sketch below. The jpt is flexible enough to handle most anything you throw at it, whether you are using the standalone version or have embedded it in your scripts.
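
As a rough sketch of that looping idea (the file names and version list are made up; the flags and paths mirror the examples above):

#!/bin/bash
# hypothetical loop: stamp out one smart group JSON per macOS version from a slimmed template
for version in 10.14 10.15 11; do
    # drop the id, then swap in a new name and criteria value
    jpt -o remove -p '/computer_group/id' ./smartgroup-1-slim.json > ./tmp1.json
    jpt -o replace -v "\"All ${version}.x Macs\"" -p '/computer_group/name' ./tmp1.json > ./tmp2.json
    jpt -o replace -v "\"${version}\"" -p '/computer_group/criteria/0/value' ./tmp2.json > "./smartgroup-${version}.json"
    rm ./tmp1.json ./tmp2.json
done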

Stop by the jpt GitHub page to get your copy