The sync should still work... are you getting an error? If you go from Otter -> BuildMaster, then it just won't copy over the role dependency relationships, but you should still have the roles and role variables, etc.
Posts made by atripp
-
RE: Infrastructure Sync Otter => BuildMaster > Dependant roles missing
-
RE: How to upgrade BuildMaster from 5.8.3 to latest version 6.2
My guess is that you also have SQL Server 2005 installed? It's possible to have multiple instances of SQL Server installed, and an instance named INEDO is installed by default w/ the installer. You'll want to upgrade that instance.
-
RE: Infrastructure Sync Otter => BuildMaster > Dependant roles missing
BuildMaster doesn't support role dependencies; that's an Otter feature only.
-
RE: How to upgrade BuildMaster from 5.8.3 to latest version 6.2
The upgrade process itself is quite easy. But do note that for v6 you'll need to also update extensions; the admin UI will guide you on doing this. It's documented here: https://inedo.com/support/kb/1163/buildmaster-6-1-upgrade-notes
You can just download the Inedo Hub, then click Upgrade; you won't be able to select 6.2 from the list of versions to upgrade, so just install the latest 6.1 (which will be the default).
-
RE: Otter and ansible
Neat! Would you mind sharing it?
We are trying to build up content libraries that show you how to do stuff like this... such as this BuildMaster and Terraform content, which does use Modules, but also establishes a nice CI/CD pattern.
Getting an idea of how to do this w/ playbooks would be nice :)
-
RE: How to upgrade BuildMaster from 5.8.3 to latest version 6.2
Hello; BuildMaster 6.2 is a "really big" upgrade (perhaps the "biggest ever"), so please take care when upgrading. Most users have had no problems.
First place I'd start is here:
https://docs.inedo.com/docs/buildmaster/installation-and-maintenance/legacy-features/62-migration
Here's more detailed info about the upgrade: https://inedo.com/support/kb/1766/buildmaster-6-2-upgrade-notes
Long story short, just upgrade to v6.1 first, make sure you're not using legacy features, then you should be ready to go to BuildMaster 6.2 :)
-
RE: Proget api for clear-cache
This isn't supported through an API; you could, however, just set up a retention policy that automatically deletes the cache as space is needed.
-
RE: Restricting API access to View/Download
Hello;
The Native API is for low-level, system functions, and it's "all or nothing". If you give someone access to the Native API, you are effectively making them an administrator, as they can also change permissions and grant admin privileges. So, I don't think you want this. Instead, you'll want to use the Debian API endpoint that we implement.
It's a third-party API format
In order to support third-party package formats like NuGet, npm, etc., ProGet implements a variety of third-party APIs. We only provide minimal documentation for these APIs, as they are generally already documented elsewhere. However, you can usually find the basics by searching for specific things you'd like to do with the API, such as "how to search for packages using the NuGet API" or "how to publish an npm package using the API".
So in this case, I recommend searching for "how to view and download apt packages".
-
RE: Cannot pip install from PyPI feed connected to another feed
Thanks, just confirming it was received, and I've attached it to the internal ticket. Stay tuned!
-
RE: Cannot pip install from PyPI feed connected to another feed
Hello; a WeTransfer would be great. I can download it and attach it to the internal ticket for this forum post! I don't have a local Python environment, so it's best to just get the files and manually upload them.
-
RE: SqlException on Proget 5.2.28 (docker version)
Can you try the 5.2.29 docker image? There seems to have been a configuration error.
-
RE: Otter and ansible
Thanks!! We'd love to learn more.
By the way, with Otter, we have a general plan to make UPack-based "rafts" and allow users to download them from a feed. This way, we can make a community feed of rafts. Easier to build and work with than extensions, I think.
-
RE: Otter and ansible
We do have a lot of users who configure Linux servers, but their usage doesn't seem much more involved than ensuring packages, files, and directories. You may be able to get a lot accomplished with just doing that? I'm not sure... happy to learn and help though!
-
RE: Cannot pip install from PyPI feed connected to another feed
The first error is a known issue; it's related to browsing for packages in the UI, and it's something we've already addressed. I'm not sure about the second error.
As for the package, can you provide a package file that we can actually upload to a test instance of ProGet?
Basically I want to take a package file, then upload it from the UI, then try to download it from the UI. By not involving Python tools, we can eliminate a lot of problems and simplify finding a solution.
-
RE: S3 Store with deny public policy set fails
Ah, thank you very much for the additional information. So, then it seems like setting it to AuthenticatedUsers is a bug. OK, that makes sense. So, I changed it. Can you try it and let me know if it works?
Instructions on installation of new extension: https://docs.inedo.com/docs/proget/administration/extensions#manual-install
Pre-release of AWS Extension: https://proget.inedo.com/feeds/PrereleaseExtensions/inedox/AWS/1.0.4-RC.3
Then, if it's ok, we can release it.
Thanks.
Alana -
RE: Cannot pip install from PyPI feed connected to another feed
Hello;
A 500 error should be logged in Admin > Diagnostic Center, so if you can check there and find it, that should help us identify what could be causing it.
Is it only that package, or all packages? If it seems to be package-specific, please share the package with us (or a version that still breaks, but has sensitive information removed), then we can try to reproduce the bug and fix it.
-
RE: S3 Store with deny public policy set fails
Hi Adam,
I'm not so familiar with the intricacies of ACLs; they are quite complicated to me... so hopefully you can help me understand, and then I'll be able to explain a change request.
We try to keep our code for this really simple. We have a property called CannedACL that works like this:

private S3CannedACL CannedACL => this.MakePublic ? S3CannedACL.PublicRead : S3CannedACL.AuthenticatedRead;

This property references an S3CannedACL, which is defined by AWS, and MakePublic is what you check in the UI. MakePublic sets PublicRead; otherwise it's AuthenticatedRead. That CannedACL is then used when creating objects:

await client.PutObjectAsync(new PutObjectRequest
{
    BucketName = this.outer.BucketName,
    Key = this.key,
    StorageClass = this.outer.StorageClass,
    CannedACL = this.outer.CannedACL,
    ServerSideEncryptionMethod = this.outer.EncryptionMethod,
    AutoCloseStream = false,
    InputStream = this.inner
}).ConfigureAwait(false);

So I think what you're asking is to use a different CannedACL? Perhaps for MakePublic we could use a drop-down list instead of a checkbox? What do you think the values ought to be?
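To make that concrete, here's a rough sketch of what a drop-down-style setting might map to; this is illustration only (the enum and property names are made up, not the actual AWS extension's properties):

// Hypothetical sketch: replace the MakePublic checkbox with a named choice.
public enum ObjectAccess
{
    Private,            // S3CannedACL.Private
    AuthenticatedRead,  // S3CannedACL.AuthenticatedRead (the non-public behavior today)
    PublicRead          // S3CannedACL.PublicRead (what MakePublic does today)
}

public ObjectAccess Access { get; set; } = ObjectAccess.Private;

private S3CannedACL CannedACL =>
    this.Access == ObjectAccess.PublicRead ? S3CannedACL.PublicRead :
    this.Access == ObjectAccess.AuthenticatedRead ? S3CannedACL.AuthenticatedRead :
    S3CannedACL.Private;

The PutObjectRequest usage above would stay exactly the same; only the UI choice feeding CannedACL would change.
-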
RE: Data migration from old PostgreSQL to new SQL server database
Hello;
Here is a guide on how to migrate packages in feeds: https://inedo.com/support/kb/1168/proget-feed-migration
As far as users and API keys, we don't have a supported migration path for those. However, many users have simply "copied" the data from one database table to another. The tables are Users, Groups, UserGroups, and ApiKeys.
If the configured encryption key in both instances is the same, it will still encrypt/decrypt fine.
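If it helps, here's roughly what that table copy could look like as a quick script. This is just a sketch, not a supported migration tool: it assumes the Npgsql and System.Data.SqlClient packages, placeholder connection strings, and that the table/column layouts line up between the two schemas (spot-check that first).

using System;
using Npgsql;
using System.Data.SqlClient;

class CopySecurityTables
{
    static void Main()
    {
        var tables = new[] { "Users", "Groups", "UserGroups", "ApiKeys" };

        using (var source = new NpgsqlConnection("Host=old-server;Database=ProGet;Username=...;Password=..."))
        using (var target = new SqlConnection("Server=new-server;Database=ProGet;Integrated Security=true"))
        {
            source.Open();
            target.Open();

            foreach (var table in tables)
            {
                using (var cmd = new NpgsqlCommand("SELECT * FROM \"" + table + "\"", source))
                using (var reader = cmd.ExecuteReader())
                using (var bulk = new SqlBulkCopy(target) { DestinationTableName = table })
                {
                    // SqlBulkCopy maps columns by ordinal by default; add ColumnMappings if the
                    // column order differs between the PostgreSQL and SQL Server tables.
                    bulk.WriteToServer(reader);
                    Console.WriteLine("Copied " + table);
                }
            }
        }
    }
}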
Best.
Alana -
RE: [Question - ProGet] Are versions amount wrong ?
Another 5.3 screenshot!
When you enable "strict" (SemVer2) versioning, you'll see how "virtual" versions get added -- they get automatically generated when you add/remove tags. The feed also restricts adding anything but SemVer2 tags.
-
RE: [BUG - ProGet] Not able to remove container description
I can confirm this works quite well in 5.3; we hope to have a pre-release this week :)
Here's a sneak peek:
-
RE: Cannot pull from another repository
I would try a couple of other things first, to narrow down the conditions.
First, how about pulling from a specific connector? Navigate to the version of the package you want, like /feeds/feed-name/adobereader/2020.006.20042, and then click "Pull to ProGet". That way you'll get a specific connector and a specific version.
If the individual packages work, but the "Add Package" route you took doesn't, then perhaps it's some sort of Mono-related bug. Those are hard to track down, and they come and go as the Mono docker image is patched. We will eventually address this by moving to a more stable .NET Core version, once it's released by Microsoft.
Try some other packages, to see if they also don't work. If some packages work, but others don't, it's probably related to your gateway/router blocking things that it sees as security risks.
If that still doesn't work, then consider attaching ProGet to a proxy server, like a Fiddler instance, by going to Admin > Proxy. You can then monitor the outbound traffic.
-
RE: Deletion of certain docker images fail
Hello;
Thank you for the bug report; there was a bad query in that stored procedure that would cause it to fail in some cases.
It will be fixed in the next maintenance release of ProGet (5.2.28), but you can download a patch (1.DockerImages_DeleteImage.sql) and run it against your database today :)
The file is attached to the ticket: https://inedo.myjetbrains.com/youtrack/issue/PG-1684
Best,
Alana -
RE: Support for R and CRAN
Thanks @miles-waller_2091 ; we're heads-down in 5.3 now, so after that we'll be able to resume investigating this. Perhaps May/June? In the meantime, if more people can volunteer to help test, we'll be able to get this going rather quickly :)
-
RE: Credentials_CreateOrUpdateCredential
Unfortunately we don't yet have an API for the credentials, but it's something we'd like to make. In the meantime, the Native API will work. If you look in the database, you'll be able to see how credentials are structured, and how things like Password are stored.
The secret fields are encrypted using DPAPI, with the encryption key stored in the configuration file.
Here's the specific code we use to encrypt/decrypt. Please share what you come up with; it would definitely help out in the meantime :)
private static byte[] Decrypt(byte[] data)
{
    if (protectedAesKey == null || protectedAesKey.Length == 0)
        throw new InvalidOperationException("Cannot decrypt persistent property; decryption key not configured.");

    byte[] key;
    try
    {
        key = ProtectedData.Unprotect(protectedAesKey, null, DataProtectionScope.LocalMachine);
    }
    catch (CryptographicException ex)
    {
        throw new InvalidOperationException(
            $"An error occurred during decryption (\"{ex.Message}\"). This usually means that the encryption key has changed between" +
            " encrypting and decrypting the data, which might happen if you accidentally overwrite a configuration setting, perhaps during an upgrade or reinstall." +
            " Check your configured encryption key, and restart the service and web application(s) as needed.");
    }

    try
    {
        var nonce = new byte[16];
        Array.Copy(data, 0, nonce, 0, 8);
        Array.Copy(data, data.Length - 8, nonce, 8, 8);

        using (var buffer = new MemoryStream(data.Length - 16))
        {
            buffer.Write(data, 8, data.Length - 16);
            buffer.Position = 0;

            using (var aes = new AesManaged { Key = key, IV = nonce, Padding = PaddingMode.PKCS7 })
            using (var cryptoStream = new CryptoStream(buffer, aes.CreateDecryptor(), CryptoStreamMode.Read))
            {
                var output = new byte[SlimBinaryFormatter.ReadLength(cryptoStream)];
                cryptoStream.Read(output, 0, output.Length);
                return output;
            }
        }
    }
    finally
    {
        if (key != null)
            Array.Clear(key, 0, key.Length);
    }
}

private static byte[] Encrypt(byte[] data)
{
    if (protectedAesKey == null || protectedAesKey.Length == 0)
        return null;

    var key = ProtectedData.Unprotect(protectedAesKey, null, DataProtectionScope.LocalMachine);
    try
    {
        using (var aes = new AesManaged { Key = key, Padding = PaddingMode.PKCS7 })
        {
            aes.GenerateIV();
            using (var outputBuffer = new MemoryStream())
            {
                outputBuffer.Write(aes.IV, 0, 8);
                using (var cryptoStream = new CryptoStream(new UncloseableStream(outputBuffer), aes.CreateEncryptor(), CryptoStreamMode.Write))
                {
                    SlimBinaryFormatter.WriteLength(cryptoStream, data.Length);
                    cryptoStream.Write(data, 0, data.Length);
                }
                outputBuffer.Write(aes.IV, 8, 8);
                return outputBuffer.ToArray();
            }
        }
    }
    finally
    {
        if (key != null)
            Array.Clear(key, 0, key.Length);
    }
}
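And for the "use the Native API" part, the request would look roughly like this. Big caveat: this is only a sketch. It assumes the usual /api/json/«method-name» Native API pattern with an API key that has Native API access, and the parameter names below are placeholders -- check the Native API reference page in your instance (and the Credentials table columns) for the real ones, and verify whether the password field needs to be pre-encrypted with the code above.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class CreateCredentialSketch
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            var values = new Dictionary<string, string>
            {
                ["API_Key"] = "«your-api-key»",
                // placeholder parameter names -- confirm against the Native API reference:
                ["Credential_Name"] = "example-credential",
                ["CredentialType_Name"] = "UsernamePassword",
                ["Configuration_Xml"] = "<Properties UserName=\"svc-account\" Password=\"«see-note-above»\" />"
            };

            using (var response = await client.PostAsync(
                "https://proget.example.com/api/json/Credentials_CreateOrUpdateCredential",
                new FormUrlEncodedContent(values)))
            {
                Console.WriteLine(response.StatusCode);
                Console.WriteLine(await response.Content.ReadAsStringAsync());
            }
        }
    }
}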
-
RE: BuildMaster Extension updates and Windows Agents
Hello; this is unusual.
But in this case, restarting the service should have also resolved the error. That will trigger the Agent Update Checker, which does restart the agents.
Most likely, a sort of load error happened on the agent; hard to say what without looking at logs. It's almost always an anti-virus or some sort of program locking files at the wrong time. If it keeps happening, it might be worth investigating further.
-
RE: Cannot pull from another repository
Hello; this error basically means that ProGet is unable to talk to Chocolatey.org for a package download request. The most likely cause is some sort of firewall/proxy that's blocking the package from being downloaded. It's also possible that you've been request-throttled by Chocolatey.org.
-
RE: SECURITY VULNERABILITY: nuget cli requires anonymous access to feed
I'm sorry that my suggestion offended you.
ProGet doesn't "prefer" anything. It's a standard, authenticated feed. If it works in your browser (i.e. the feed endpoint URL), then it should work in NuGet. If it doesn't work in NuGet, then NuGet isn't being configured correctly.
"Behind the scenes", NuGet uses "basic authentication" to transmit those credentials to ProGet. It's the same mechanism your browser uses when you navigate to the feed API (i.e. a pop-up).
Credentials are stored in the nuget.config; the link I provided shows you how to edit that configuration file. You can either edit it yourself or use the sources command-line argument.
https://docs.microsoft.com/en-us/nuget/reference/nuget-config-file#packagesourcecredentials
You can attach a tool like Fiddler to NuGet.exe, and see what it's sending in different cases. It might be a NuGet bug.
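If you do sniff the traffic, what you should see from nuget.exe is an ordinary Basic authentication header. Here's a tiny sketch (illustration only, with a placeholder feed URL and credentials) of the same request you could make yourself to confirm the feed authenticates fine outside of NuGet:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class FeedAuthCheck
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // the same "user:password" pair you'd put in nuget.config, base64-encoded
            var credentials = Convert.ToBase64String(Encoding.UTF8.GetBytes("someuser:somepassword"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

            // placeholder URL -- use your actual ProGet feed API endpoint
            var response = await client.GetAsync("https://proget.example.com/nuget/MyFeed/");
            Console.WriteLine((int)response.StatusCode + " " + response.StatusCode);
        }
    }
}

If that returns a 200 but nuget.exe still prompts, the problem is on the nuget.exe configuration side.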
-
RE: SECURITY VULNERABILITY: nuget cli requires anonymous access to feed
Those posts are over 5 years old, and nuget.exe has made a lot of improvements since then. Back then, it wasn't easy to authenticate to private feeds.
So, if you follow the Microsoft article I linked, it will help you configure nuget.exe to talk to an authenticated feed in ProGet. If you're continuing to have trouble, the problem isn't with ProGet, but with some sort of nuget.exe configuration. Your best bet will be to do a wide search, like "NuGet.exe prompting for credentials".
-
RE: SECURITY VULNERABILITY: nuget cli requires anonymous access to feed
Hi; I'm not really sure what the issue is?
NuGet will prompt for credentials if it's not an anonymous feed, that's by design. Here is information on how to store credentials with nuget.exe: https://docs.microsoft.com/en-us/nuget/consume-packages/consuming-packages-authenticated-feeds
Once credentials are stored, then you won't be prompted again. That's also by design of nuget.
-
RE: Proget: docker login returns unauthorized
It's really hard to say. This doesn't seem to be impacting others... we can't repro it... and I'm not sure what else it could be.
If you can get us some more very specific details about it, perhaps using some sort of Fiddler trace, then we can try to investigate further.
-
RE: RPM upload fails - 42P01: missing FROM-clause entry for table "rpv"
Is there an error in ProGet > Admin > Diagnostic Center that corresponds to this? Can you provide the full stack trace?
-
RE: Update failed to Proget 5.1.23
Hello;
It's a problem with the database... there was probably some failed change script, a while ago. But beyond that, diagnosing/troubleshooting isn't going to be trivial and isn't worth the time, especially since... Postgres has been deprecated in 5.2. It won't be available in 5.3.
So, I would recommend taking the opportunity to first move to SQL Server (i.e. install a new instance of the same version that uses SQL Server). You can use this guide to migrate feeds: https://inedo.com/support/kb/1168/proget-feed-migration
Then, you can perform the upgrade to 5.2 on the new instance.
You can migrate feed-by-feed. -
RE: `dotnet nuget push --skip-duplicate` does not work as expected
We have a new major release coming up, 5.3. So perhaps we'll just change the response there.
And if it actually causes a problem, and others have trouble updating their own scripts, then we can consider adding a flag.
-
RE: $PSCredential- round two
@Jonathan-Engstrom said in $PSCredential- round two:
Same code, same machine.
Same user?
@Jonathan-Engstrom said in $PSCredential- round two:
it would help to understand why.
FYI -- Otter doesn't format PowerShell; it parses the PowerShell script (using Microsoft's parser), looks for variable tokens, and "injects" a variable into the runtime if there's a matching variable.
The interactive PowerShell host (which you're using, ps.exe) also does things differently. There's a ton of layers-upon-layers with Active Directory, so it'll take some trial and error to find out what's happening.
-
RE: `dotnet nuget push --skip-duplicate` does not work as expected
Overwriting packages on NuGet.org is totally impossible, but it's possible in ProGet if you have the correct privileges. This is why ProGet returns a 401 (not authorized). We're a bit hesitant to change this, but will consider it in a major release.
To be honest, this flag on dotnet nuget push doesn't make much sense, even for NuGet.org. What's the use case? What sort of build process doesn't generate new package numbers on build?
We don't like encouraging poor workflows. For example, this is why we don't have an "easy package delete" function; deleting packages isn't something you should be doing so often that you'd need a quick delete, and if you are, you're probably doing it wrong.
So hopefully you can help us understand this workflow and why it would make sense to use?
-
RE: NPM Connector to Azure DevOps
Unfortunately we don't have any documentation specifically for Azure DevOps NPM feeds; they change fairly often, which makes it hard for us to keep track. We did try/test it at one point, a while back, but our code for this feature hasn't substantially changed since then.
It's supposed to be as simple as an empty username (ENAME) and a token as your password (PAT). That's used to request a Bearer token from the NPM api, and send that back in a header.
I looked at their docs, and it says "username (can be anything except empty), PAT, and email". Not sure why they require username. Do they look you up by email? Weird.
Anyways, that's strange. So I guess I'd also try EMAIL + PAT. That should also work.
-
RE: Getting windows service status into variable
Great to hear!
Well, of course, you could store the information about which services are active in BuildMaster. I would imagine it's not something that changes very often, so perhaps this is OK?
One approach might be to make a server role called hdars-service and then add a variable to that role called $ActiveServer with the server name. You would set a pipeline target to run deploy-hdars against the hdars-service role, and in that plan, just do an if $ServerName == $ActiveServer to decide whether to start/stop the service, or whatever.
Otherwise, I think it's a matter of using PowerShell of some sort, whether $PSEval or an OtterScript module with an output variable.
-
RE: Nuget install prompting for credentials
The first thing that comes to mind is Integrated Windows Authentication; you often won't be able to authenticate unless your computer is on the domain. This is a limitation/feature of Windows (i.e. it's by design to work this way).
As a work-around, you can set up a second site that doesn't have IWA enabled, and point it to the same folder on disk.
-
RE: NPM Connector to Azure DevOps
Hello;
I'm not really familiar with how Azure DevOps implements npm authentication, but it seems there's definitely a problem. The endpoint is not returning JSON, and it's giving a 400 error; my guess is that it's an empty response, so who knows what the error message is. I wonder if the URL is wrong? Are there Azure DevOps logs you can inspect?
Otherwise it's hard to say where the problem is with Azure DevOps, because the behavior changes all the time. In the past, I've seen 400 errors come and go, and sometimes it's their way of telling you that you've configured a token wrong. Or, they have a bug implementing NPM's API (this is common), so maybe try a different request? Both errors are happening while trying to index the connector. Try searching, etc.
In any case, the easiest thing to do would be to replicate the way ProGet does authentication. Here's how we handle it in ProGet; first, the logic to determine if a bearer token should be used...
protected override async Task<HttpWebRequest> CreateWebRequestAsync(string url)
{
    var request = await base.CreateWebRequestAsync(url).ConfigureAwait(false);

    if (this.Password != null && (string.IsNullOrEmpty(this.UserName) || this.UserName.Contains('@')))
    {
        var bearerToken = await this.BearerAuthToken.ValueAsync.ConfigureAwait(false);
        if (bearerToken != null)
        {
            request.Headers.Set(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
        }
    }

    request.Accept = "application/json";
    return request;
}
And the bearer-token acquisition logic....
private async Task<string> RequestBearerAuthTokenAsync()
{
    if (string.IsNullOrEmpty(this.UserName))
    {
        return AH.Unprotect(this.Password);
    }

    var request = await base.CreateWebRequestAsync(this.ResolveUrl("-/user/org.couchdb.user:" + Uri.EscapeDataString(this.UserName))).ConfigureAwait(false);
    request.Accept = "application/json";
    request.ContentType = "application/json";
    request.Method = "PUT";

    using (var requestStream = await request.GetRequestStreamAsync().ConfigureAwait(false))
    using (var writer = new StreamWriter(requestStream, InedoLib.UTF8Encoding))
    using (var jsonWriter = new JsonTextWriter(writer))
    {
        jsonWriter.WriteStartObject();
        jsonWriter.WritePropertyName("_id");
        jsonWriter.WriteValue("org.couchdb.user:" + this.UserName);
        jsonWriter.WritePropertyName("name");
        jsonWriter.WriteValue(this.UserName);
        jsonWriter.WritePropertyName("password");
        jsonWriter.WriteValue(AH.Unprotect(this.Password));
        jsonWriter.WritePropertyName("email");
        jsonWriter.WriteValue(this.UserName);
        jsonWriter.WritePropertyName("type");
        jsonWriter.WriteValue("user");
        jsonWriter.WritePropertyName("roles");
        jsonWriter.WriteStartArray();
        jsonWriter.WriteEndArray();
        jsonWriter.WritePropertyName("date");
        jsonWriter.WriteValue(DateTimeOffset.UtcNow);
        jsonWriter.WriteEndObject();
    }

    using (var response = await request.GetResponseAsync().ConfigureAwait(false))
    using (var responseStream = response.GetResponseStream())
    using (var reader = new StreamReader(responseStream, InedoLib.UTF8Encoding))
    using (var jsonReader = new JsonTextReader(reader))
    {
        var result = JObject.Load(jsonReader);
        return (string)result.Property("token")?.Value;
    }
}
It doesn't look like the error is happening when requesting a token, just when doing the request. Hope this is a good start, at least!
-
RE: Feature Request: Please add inside each Role which servers have drifted, and which ones are compliant.
Thanks for tracking those down, I updated owner on them :)
-
RE: Feature Request: Please add inside each Role which servers have drifted, and which ones are compliant.
Yes; this was a result of the forums migration. If you find other posts we can change the owner.
-
RE: Simplify running script assets in Otter Configurations (PSEnsure)
It's designed to download a file to disk; so naming aside (pretend it's called Write-Asset), it's still writing a file to disk, and thus it won't run in the collect phase.
As I mentioned, the feature wasn't designed to behave the way you'd like; most users will create/publish PowerShell module packages.
What you're describing is nontrivial from a design perspective, and we'd like to really consider it for the next major release of Otter, which is going to be focusing on things like "Compliance as Code". Based on some examples you shared, and how I've seen your Otter usage, I think you'll find those a lot more useful. We'll be exploring this mid-Q2.
In the meantime, I recommend you just maintain the assets on disk somehow, and import them that way.
-
RE: WinRM issues
There haven't been any other feature requests for this, and we haven't researched the feasibility of doing it; however, we did add Impersonation and Process Isolation, both of which were in-demand features.
-
RE: API Endpoint URL Errors: Could not establish trust relationship for the SSL/TLS secure channel.
I don't know if it will be sufficient to provide access to only http://files.pythonhosted.org/ and https://pypi.org/ - it's very possible that the file hosting locations will change. This is the case on a lot of other package galleries, including NuGet.org.
If ProGet can only download a portion of the files, then there will probably be some strange errors. It's too difficult to generalize, so if you can provide us with a specific package and a specific reproduction situation, we'll be happy to try it.
But, as the error says, ProGet only supports those wheel/source packages; the legacy "egg" format (10+ years old, I think) is not supported. Your developers should be able to help convert it, and you can rehost the handful of "egg" packages as needed.
As far as other errors you may see... don't think of the "Diagnostic Center" as a "checklist of things to fix"; it's there to help diagnose a problem... and unless you have one reported by a user, it's probably fine. They can come from so many sources (including temporary network outages, users typing in wrong URLs / passwords, etc).
-
RE: API Endpoint URL Errors: Could not establish trust relationship for the SSL/TLS secure channel.
This error means that the package file data (i.e. what is being returned by the URL that ProGet is instructed to download from) is invalid. So I'm thinking some sort of intermediary is blocking/rewriting these requests.
Sometimes, I see firewalls / proxies inspecting the contents, and then displaying "this content is blocked by corporate firewall" instead. The proxy should return an error status, but sometimes it's just a 200. So, ProGet expects package data, but instead gets random HTML.
You should be able to do this:
- Create PyPi Feed
- Add Connector to PyPi.org
- Pull a package (I used girth since it was at the top of the list)
If that's not working, then something is blocking the download from pypi.org.
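If you want to see what your network is actually returning for one of those download URLs (outside of ProGet), a quick throwaway check like this can help; it's just a sketch, and the URL is a placeholder you'd swap for a real file URL from files.pythonhosted.org:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class DownloadCheck
{
    static async Task Main()
    {
        // placeholder -- use an actual file URL for the package you're testing
        var url = "https://files.pythonhosted.org/packages/.../some-package.whl";

        using (var client = new HttpClient())
        using (var response = await client.GetAsync(url))
        {
            Console.WriteLine("Status: " + (int)response.StatusCode);
            Console.WriteLine("Content-Type: " + response.Content.Headers.ContentType);

            var bytes = await response.Content.ReadAsByteArrayAsync();
            // wheel files are zip archives, so legitimate data should start with "PK";
            // an HTML "blocked by firewall" page will start with something like "<htm"
            Console.WriteLine("First bytes: " + Encoding.ASCII.GetString(bytes, 0, Math.Min(bytes.Length, 4)));
        }
    }
}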
-
RE: Proget: docker login returns unauthorized
Hello; this should get resolved in PG-1676, which is scheduled for the next maintenance release.
-
RE: docker pull from proget not working
Hello; this should get resolved in PG-1676, which is scheduled for the next maintenance release.
-
RE: Simplify running script assets in Otter Configurations (PSEnsure)
@Jonathan-Engstrom said in Simplify running script assets in Otter Configurations (PSEnsure):
I am not sure why they are so different, or would be designed in such a manner to disallow this to work.
The collect pass is "read only" and isn't supposed to change any server configuration at all. A Get-Asset operation always changes configuration (it always puts a file on disk), so by design, it doesn't run during the collect pass.
I know a lot of people use PowerShell module packages (i.e. things you install on a server), but I can see why using assets for this would be nice. It was never a design we had considered, which is why we'd need to modify the operations to do an automatic import.
I guess the only alternative I can think of now is to store the assets on disk, and use a sort of Ensure-Asset in another plan to do that. Then refer to it like c:\modulestore\myasset.ps1 or something.
-
RE: Proget Docker Nuget creating extra empty folder with different case.
While this is sub-optimal behavior, it's the first report of this issue/bug, and the impact seems relatively small; tracking it down and fixing it may not be trivial, and might even introduce a regression.
Prior to backing up, how about running a script that just deletes the empty folders? If we get more reports of this, or hear about a wider impact, we'll absolutely investigate further.