A better way to do ADSK Deployment Logs

Well, it’s that time of year again, and I have been madly building & testing Autodesk deployments and Px Tools recipes. And, as in previous years, I have been frustrated with the way deployment logs work. But this year, I think I happened upon a solution!

Default Behavior

First off, a little background. When you build a deployment, the default is create network log file checked, with no log file path defined. This is in addition to local logs in Temp checked and silent install checked.

Running the deployment silently is self-evident; why would you ever NOT want the deployment to run silently? The local log setting also has minimal impact: if you don't select it, the user temp folder is still going to be full of individual logs. These local logs are much more detailed than the network log, and sometimes they are the only logs that provide the detail you need. But the network log will help you determine which of the detailed local logs to look in. You really need both, so you might as well leave that selected too.
But, back on topic: with the network log file checked but no path defined, you get the default path in your deployment INI file, and the log file is named after the deployment itself.

This results in a single common network log file, shared by every single machine that runs this deployment, with each machine appending its hundreds of lines of log data at completion.

It is that append behavior where things go sideways. When a lot of machines are using that log file, it gets really frustrating to dig back through a bunch of successful machines' log lines to find the one machine that had problems. That is exactly the scenario that occurs when I am testing deployments & recipes on Windows 7, Windows 8.1, and a bunch of Windows 10 builds. It's also what happens when you lot in the trenches are trying to roll out an office full of machines all at once. And because the log files are scattered throughout the deployment folders, it's not easy to delete them after a round of installs, so the files just get bigger and bigger, which makes them even more frustrating to use when I need to scroll through half a decade of test installs to get to the error I just ran into when testing, say, a Revit 2016 install on a Windows 10 1909 VM. Ugh!

This year, when contemplating the prospect of this pain, and also having some extra thinking time and a vino quarantino in hand, I decided to look for a solution.

First Idea

In various situations you can use an environment variable in a path, and whatever program uses that path will expand the environment variable before actually using it. You just wrap the environment variable name in percent signs (%) in the path. You may well be familiar with this already, as %temp% in Windows Explorer or Search is a nice shortcut to C:\Users\USERNAME\AppData\Local\Temp, where USERNAME is the logged-in user. And indeed, %USERNAME% is also a quick shortcut to just that user name, in situations where environment variables are expanded.

This is by no means a universally available option. For example, one might want to use %USERNAME% in certain path nodes in a seeded DynamoSettings.xml. But alas, that isn't an option, because Dynamo doesn't expand environment variables.

However, back on topic, %COMPUTERNAME% is also an available Environment Variable, and Autodesk deployments do expand Environment Variables in log file paths! So, my first attempt at a fix was simply to add the environment variable to the end of the path in the Deployment INI, like so…

This results in a folder for every machine in the deployment Log folder…


… and within that a dedicated log file, with just that specific machine's log data. Woot! The log is easier to peruse for errors, and then I can easily delete that machine's folder and the log in it.
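Under the hood, the installer is doing a simple token substitution on the path before writing the log. As a rough sketch of that behavior (the server path and machine names here are made up for illustration), in Python:

```python
import os
import re

def expand_windows_vars(path, env=None):
    """Expand %NAME% tokens the way Windows expands environment
    variables in a path (a sketch of what the installer does)."""
    env = dict(os.environ) if env is None else env
    # Replace each %NAME% with its value; leave unknown tokens alone.
    return re.sub(r"%([^%]+)%", lambda m: env.get(m.group(1), m.group(0)), path)

# The same path from the deployment INI resolves to a different
# folder on every machine (share and machine names are hypothetical):
template = r"\\server\Installs\ADSK\2021\Img\Log\%COMPUTERNAME%"
for machine in ("WS-ROT-01", "WS-BER-07"):
    print(expand_windows_vars(template, {"COMPUTERNAME": machine}))
```

Because %COMPUTERNAME% is unique per workstation, one INI path fans out into one folder per machine with no further effort.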

But we can take it a bit further.

My Current Approach

Rather than having a Log folder in each deployment, I am now using a single shared _Logs folder in the root of my Autodesk deployments folder. I have a share for all my install resources that contains an ADSK folder, which in turn contains year folders holding the actual deployments for each year. In the root of that ADSK folder I have _Logs…
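For reference, that layout looks roughly like this (the server and share names are invented for illustration):

```text
\\server\Installs\
└── ADSK\
    ├── _Logs\          <- single shared log folder, one subfolder per machine
    ├── 2020\           <- year folders containing the actual deployments
    └── 2021\
```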

… and my log path is identical for every deployment now.

When I install I get a folder there for the specific machine…

… and that folder contains log files for every deployment I just ran on that machine, multiple products, multiple years if that’s what I ran, all in one place, and all just that one machine’s data.

I can easily find any error data I need because the log files themselves are machine specific, and when I am done I can delete an entire day's worth of logs!
In an office scenario, I would roll out new installs office-wide, review the logs of the specific machines that had issues, then delete all that day's folders. Especially in a multiple-location office, where I am rolling out 2021 to all the Rotterdam machines on Monday, all the Berlin machines on Tuesday, etc., this is a big improvement in workflow.
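That end-of-day cleanup is easy to script as well. Here is a minimal sketch in Python, assuming a shared _Logs folder with one subfolder per machine (the function name, path, and machine names are hypothetical):

```python
import shutil
from pathlib import Path

def purge_machine_logs(logs_root, keep=()):
    """Delete every per-machine folder under the shared log folder,
    skipping any machine names we still want to investigate.
    Returns the sorted names of the folders that were removed."""
    removed = []
    for folder in Path(logs_root).iterdir():
        if folder.is_dir() and folder.name not in keep:
            shutil.rmtree(folder)
            removed.append(folder.name)
    return sorted(removed)

# e.g. after reviewing Monday's Rotterdam rollout, keep only the one
# machine that had problems and clear the rest:
# purge_machine_logs(r"\\server\Installs\ADSK\_Logs", keep={"WS-ROT-04"})
```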

And this brings up the other place where the default behavior is potentially an issue: DFS, or Distributed File System. This is a Windows feature that is a real game changer for multiple-location offices. It basically allows you to have a single path to deployments in every office, but with a local cache. So you build your deployments at the “home” office, and they get distributed to all the other locations automatically. You get local-resource performance with a common folder structure. However, those logs are now being updated on each local cache differently, so at BEST DFS Replication can maintain each machine's information in a contiguous block, while also making the files utterly gigantic. At worst, either the blocks get co-mingled, or local log files are overwritten completely by the logs from the “home” office, making troubleshooting much more difficult. I believe DFSR will only overwrite common files, so having independent logs for every machine should ensure nothing gets overwritten and logs are available when you need them. That said, I don't have DFS configured in my little test environment, so I would love to hear from folks how the default behavior actually works, and what, if any, value this approach actually provides. Anyone willing to be a guinea pig for 2021?

One last note. Anyone using Px Tools to manage their software may be tempted to revise their Settings.xml file to put the Px Tools log in this same machine-name folder. I had the same thought, but I quickly discovered that it is better to have all the Px Tools logs in a single folder where I can quickly see which Conforms had errors, then review those logs to see what the errors were. Only when the Px Tools log points at a Deployment Install being the source of trouble do I need to go review the Autodesk logs, and at that point this new approach demonstrates its value. Putting the Px Tools log in the ADSK log folder just makes that initial review much more difficult, so I don't recommend it.

Anyway… I hope that was useful for some folks, or at least educational. I know I am super happy with this approach so far this year!

Stay healthy, and have a great day!
