Expected backup exporter performance

General discussion of anything Unimus
ptorbjornsson
Posts: 23
Joined: Tue Aug 11, 2020 12:08 pm

Wed Apr 27, 2022 12:29 pm

Hi,

I just got the backup exporter script from GitHub, and have managed to get it up and running.
I was just wondering what the expected performance speed of the script is. During the initial get of devices from Unimus (function getAllDevices), it seems that I am averaging around 1 device per second. At currently over 20k devices that would make the export take over 6 hours (not counting later stages in the process). Is this to be expected, or do I have a bottleneck somewhere?
Unimus is on version 2.2.0-Beta3 and I am running the exporter script on a CentOS 7 host.
Vik@Unimus
Posts: 100
Joined: Thu Aug 05, 2021 6:35 pm

Wed Apr 27, 2022 6:33 pm

Hello,

We haven't tested the script at this scale, so it is hard to say what the bottleneck could be. That said, averaging one device per second doesn't look quite right to me, though it also depends on how you are measuring the performance.
We will look into it and run some internal testing, since we can simulate a very large number of devices. I will then get back to you with our findings so we can compare notes and correlate the data.
ptorbjornsson
Posts: 23
Joined: Tue Aug 11, 2020 12:08 pm

Thu Apr 28, 2022 4:22 am

Hi again,

Thank you for looking into it! Looking forward to hearing the results.
ptorbjornsson
Posts: 23
Joined: Tue Aug 11, 2020 12:08 pm

Thu Apr 28, 2022 7:19 am

I managed to improve performance a lot by adding size=50 to the curl call for getting devices. It seems to be a jq performance issue with parsing the very large JSON response when all devices are fetched in one go?

Code: Select all

function getAllDevices(){
...
        local contents=$(unimusGet "devices?size=50&page=$page")
...
and then the same for the subsequent function that gets the backups.
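For reference, a paginated fetch along those lines might look like the sketch below. This is a minimal illustration, not the exporter's actual code: the unimusGet helper, the UNIMUS_ADDRESS and UNIMUS_TOKEN variables, and the field selection are all assumptions.

```shell
# Assumed helper wrapping curl with the Unimus address and API token
# (illustrative only; the real exporter's helper may differ).
unimusGet() {
    # $1 is the API path plus query string
    curl -sS -H "Authorization: Bearer $UNIMUS_TOKEN" \
        "$UNIMUS_ADDRESS/api/v2/$1"
}

getAllDevices() {
    local page=0
    while true; do
        local contents
        contents=$(unimusGet "devices?size=50&page=$page")
        # Stop when a page comes back with no data; jq -e sets its
        # exit status from the boolean result of the filter.
        if ( jq -e '.data | length == 0' <<< "$contents" ) >/dev/null; then
            break
        fi
        # Emit one device id per line (pick whatever fields you need).
        jq -r '.data[].id' <<< "$contents"
        page=$((page + 1))
    done
}
```

Fetching in pages of 50 keeps each JSON document small, so jq's parse time stays roughly constant per page instead of growing with the total device count.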

Edit:
I realised that adding size to the curl call for getting backups breaks the loop that iterates through pages, because the if statement that checks whether the page contains any data is slightly different from the one in getAllDevices:

Code: Select all

function getAllBackups(){
...
       if [ $(jq -e '.data | length == 0' <<< $contents) ] >/dev/null; then
                break
        fi
...
If I change it to match the statement in getAllDevices, it works.

Code: Select all

        if ( jq -e '.data | length == 0' <<< $contents ) >/dev/null; then
            break
        fi
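For anyone hitting the same thing: the difference matters because `[ $(jq -e ...) ]` tests the captured string ("true" or "false"), and `[` treats any non-empty string as true, so that check fires on every page and breaks the loop immediately. The working form runs jq in a subshell and uses its exit status, which -e sets from the boolean result. A standalone illustration (not taken from the exporter):

```shell
# jq -e sets its exit status from the last output value:
# true -> 0, false/null -> 1.
nonempty='{"data":[1,2,3]}'
empty='{"data":[]}'

# Broken check: $(...) captures the string "false", and
# [ "false" ] is true simply because the string is non-empty.
if [ $(jq -e '.data | length == 0' <<< "$nonempty") ]; then
    echo "broken check fires even though data is present"
fi

# Working check: the subshell's exit status is jq's exit status.
if ( jq -e '.data | length == 0' <<< "$nonempty" ) >/dev/null; then
    echo "not printed: data is present"
fi
if ( jq -e '.data | length == 0' <<< "$empty" ) >/dev/null; then
    echo "empty page detected"
fi
```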