First of all, thanks for this exporter which works great with my local backups. I really enjoy seeing the detailed insights that come with this software and the dashboard.
However, I have also set up the exporter with a repository that lives in Google Cloud Storage, and to my surprise, a lot of data was fetched from the bucket. The backup isn't huge, about 15 GiB of small files, and it usually costs me under €1/month to store. But with the restic exporter running and scraped by my Prometheus (scrape interval of 60 s), it constantly downloaded metadata from the GCS bucket, which amounted to about 100 GiB, or €10, of traffic per day.
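For anyone hitting the same issue before a proper fix lands: raising the scrape interval for just this exporter's job reduces the number of repository reads proportionally. A sketch of what that could look like in a standard `prometheus.yml`; the job name and the exporter address/port are assumptions, adjust to your setup:

```yaml
scrape_configs:
  - job_name: restic_exporter_gcs   # hypothetical job name
    # One scrape every 30 minutes instead of every minute:
    # ~30x less metadata traffic against the cloud bucket.
    scrape_interval: 30m
    scrape_timeout: 2m
    static_configs:
      - targets: ["localhost:8001"]  # assumed exporter address
```

One caveat: Prometheus marks a series stale about 5 minutes after its last sample, so with intervals this long the metrics show gaps between scrapes unless queries account for it (e.g. with `last_over_time`).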
I know that it is my responsibility to check the cost of my infrastructure, and I have since deactivated the exporter for my Google Cloud repository. Still, I would like to report it here for two reasons:

1. Could you add documentation to the README explaining which interactions happen with the backup repository, and warning that high costs can occur on cloud/pay-per-download storage?
2. At the 100 GiB/day I was billed by Google, each minutely scrape would have transferred roughly 71 MiB, a sustained rate of almost 10 Mbit/s. That seems excessive. Perhaps some inefficient restic operations are being invoked? I don't know enough about the internals of restic and restic-exporter to optimise this myself, but maybe someone else does?
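For reference, the back-of-the-envelope numbers above can be checked in a few lines (the 100 GiB/day figure is from my billing; everything else is derived from it):

```python
# Rough check of the traffic figures: 100 GiB/day billed,
# with the exporter scraped once per minute.
GIB = 1024 ** 3
daily_bytes = 100 * GIB
scrapes_per_day = 24 * 60                 # one scrape every 60 s

# Average data transferred per scrape, in MiB.
per_scrape_mib = daily_bytes / scrapes_per_day / 1024 ** 2

# Equivalent sustained bandwidth, in Mbit/s.
sustained_mbit = daily_bytes * 8 / (24 * 3600) / 1e6

print(f"{per_scrape_mib:.0f} MiB per scrape")    # ~71 MiB
print(f"{sustained_mbit:.1f} Mbit/s sustained")  # ~9.9 Mbit/s
```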
Thank you again for your project; I'd welcome any discussion!