
Out Of Memory When Trying To Use Data Tiering #4771

Closed
darsh12 opened this issue Mar 15, 2025 · 1 comment
Labels
bug Something isn't working

Comments

darsh12 commented Mar 15, 2025

Describe the bug
When I start Dragonfly with data tiering enabled and a lower memory limit, I get an out-of-memory error.

To Reproduce
Steps to reproduce the behavior:

  1. Start Dragonfly without data tiering and a high memory limit: /dragonfly --logtostderr --maxmemory=28GB --dir /mnt/vol1 --dbfilename backup
  2. Populate the database with dummy data: redis-cli debug populate 51111119 npm:10000
  3. Shut down the database
  4. Restart the database with tiering enabled and a lower memory limit: /dragonfly --logtostderr --maxmemory=2GB --dir /mnt/vol1 --dbfilename backup --tiered_prefix /mnt/tiered/

Expected behavior
The db should start up

Environment (please complete the following information):

  • OS: ubuntu 24.04.2
  • Kernel: Linux ubuntu-8gb-hel1-1 6.8.0-55-generic 57-Ubuntu SMP PREEMPT_DYNAMIC Wed Feb 12 23:42:21 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
  • Containerized?: no
  • Dragonfly Version: v1.27.2-0e56a09f70de57bfb52dcd438326a65ae20d386b

Additional context

Hi, I am currently trying to get data tiering enabled, but I keep running into the same issue.

I am following this guide, albeit with limited memory: https://www.dragonflydb.io/blog/a-preview-of-dragonfly-ssd-tiering

I initially loaded the database with 51 million keys, which takes up approximately 3GB of memory:

root@ubuntu-8gb-hel1-1:~# ./dragonfly --logtostderr --maxmemory=28GB --dir /mnt/vol1 --dbfilename backup
I20250315 18:21:42.129357  2007 init.cc:78] ./dragonfly running in opt mode.
                   .--::--.
   :+*=:          =@@@@@@@@=          :+*+:
  %@@@@@@%*=.     =@@@@@@@@-     .=*%@@@@@@#
  @@@@@@@@@@@@#+-. .%@@@@#. .-+#@@@@@@@@@@@%
  -@@@@@@@@@@@@@@@@*:#@@#:*@@@@@@@@@@@@@@@@-
    :+*********####-%@%%@%-####********++.
   .%@@@@@@@@@@@@@%:@@@@@@:@@@@@@@@@@@@@@%
   .@@@@@@@@%*+-:   =@@@@=  .:-+*%@@@@@@@%.
     =*+-:           ###*          .:-+*=
                     %@@%
                     *@@*
                     +@@=
                     :##:
                     :@@:
                      @@
                      ..
* Logs will be written to the first available of the following paths:
/tmp/dragonfly.*
./dragonfly.*
* For the available flags type dragonfly [--help | --helpfull]
* Documentation can be found at: https://www.dragonflydb.io/docs
I20250315 18:21:42.129909  2007 dfly_main.cc:744] Starting dragonfly df-v1.27.2-0e56a09f70de57bfb52dcd438326a65ae20d386b
W20250315 18:21:42.130885  2007 dfly_main.cc:805] Got memory limit 28.00GiB, however only 7.16GiB was found.
I20250315 18:21:42.130957  2007 dfly_main.cc:807] Max memory limit is: 28.00GiB
I20250315 18:21:42.135537  2008 uring_proactor.cc:285] IORing with 1024 entries, allocated 98368 bytes, cq_entries is 2048
I20250315 18:21:42.210391  2007 proactor_pool.cc:149] Running 4 io threads
I20250315 18:21:42.216118  2007 server_family.cc:835] Host OS: Linux 6.8.0-55-generic x86_64 with 4 threads
I20250315 18:21:42.216689  2007 snapshot_storage.cc:185] Load snapshot: Searching for snapshot in directory: "/mnt/vol1"
I20250315 18:21:42.217147  2007 server_family.cc:1113] Loading /mnt/vol1/backup-summary.dfs
I20250315 18:21:42.238056  2009 listener_interface.cc:101] sock[11] AcceptServer - listening on port 6379
I20250315 18:22:16.128997  2008 server_family.cc:1153] Load finished, num keys read: 51111119
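As a rough sanity check on the numbers above (assuming the ~3GB resident figure quoted for this dataset; the exact per-key overhead depends on value sizes and Dragonfly internals), the load works out to roughly 63 bytes per key:

```python
# Back-of-the-envelope memory-per-key estimate.
# Assumes the ~3GB figure stated in the report; not an exact measurement.
num_keys = 51_111_119
used_bytes = 3 * 2**30  # ~3 GiB

bytes_per_key = used_bytes / num_keys
print(f"~{bytes_per_key:.0f} bytes/key")  # ~63 bytes/key
```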

I then attempt to start with data tiering enabled:

root@ubuntu-8gb-hel1-1:~# ./dragonfly --logtostderr --maxmemory=2GB --dir /mnt/vol1 --dbfilename backup --tiered_prefix /mnt/tiered/
I20250315 18:24:09.149394  2020 init.cc:78] ./dragonfly running in opt mode.
                   .--::--.
   :+*=:          =@@@@@@@@=          :+*+:
  %@@@@@@%*=.     =@@@@@@@@-     .=*%@@@@@@#
  @@@@@@@@@@@@#+-. .%@@@@#. .-+#@@@@@@@@@@@%
  -@@@@@@@@@@@@@@@@*:#@@#:*@@@@@@@@@@@@@@@@-
    :+*********####-%@%%@%-####********++.
   .%@@@@@@@@@@@@@%:@@@@@@:@@@@@@@@@@@@@@%
   .@@@@@@@@%*+-:   =@@@@=  .:-+*%@@@@@@@%.
     =*+-:           ###*          .:-+*=
                     %@@%
                     *@@*
                     +@@=
                     :##:
                     :@@:
                      @@
                      ..
* Logs will be written to the first available of the following paths:
/tmp/dragonfly.*
./dragonfly.*
* For the available flags type dragonfly [--help | --helpfull]
* Documentation can be found at: https://www.dragonflydb.io/docs
I20250315 18:24:09.149964  2020 dfly_main.cc:744] Starting dragonfly df-v1.27.2-0e56a09f70de57bfb52dcd438326a65ae20d386b
I20250315 18:24:09.150837  2020 dfly_main.cc:807] Max memory limit is: 2.00GiB
I20250315 18:24:09.155282  2021 uring_proactor.cc:285] IORing with 1024 entries, allocated 98368 bytes, cq_entries is 2048
I20250315 18:24:09.229856  2020 proactor_pool.cc:149] Running 4 io threads
I20250315 18:24:09.235793  2020 engine_shard_set.cc:66] max_file_size has not been specified. Deciding myself....
I20250315 18:24:09.235841  2020 engine_shard_set.cc:83] Max file size is: 59.83GiB
I20250315 18:24:09.241425  2020 server_family.cc:835] Host OS: Linux 6.8.0-55-generic x86_64 with 4 threads
I20250315 18:24:09.242559  2020 snapshot_storage.cc:185] Load snapshot: Searching for snapshot in directory: "/mnt/vol1"
I20250315 18:24:09.243332  2020 server_family.cc:1113] Loading /mnt/vol1/backup-summary.dfs
W20250315 18:24:09.244715  2024 rdb_load.cc:2476] Could not load snapshot - its used memory is 2613067840 but the limit is 2147483648
W20250315 18:24:09.244719  2022 rdb_load.cc:2476] Could not load snapshot - its used memory is 2613067840 but the limit is 2147483648
W20250315 18:24:09.244719  2021 rdb_load.cc:2476] Could not load snapshot - its used memory is 2613067840 but the limit is 2147483648
W20250315 18:24:09.245172  2024 rdb_load.cc:2476] Could not load snapshot - its used memory is 2613067840 but the limit is 2147483648
W20250315 18:24:09.245386  2023 rdb_load.cc:2476] Could not load snapshot - its used memory is 2613067840 but the limit is 2147483648
E20250315 18:24:09.248170  2021 server_family.cc:1150] Rdb load failed: Out of memory, or used memory is too high
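The rdb_load warning is a straightforward up-front check: the used-memory figure recorded in the snapshot exceeds the configured maxmemory, so the load is refused before any tiering can kick in. Converting the logged byte counts makes the gap explicit:

```python
# Values taken verbatim from the rdb_load.cc warning above.
snapshot_used = 2_613_067_840   # used memory recorded in the snapshot
maxmemory     = 2 * 2**30       # --maxmemory=2GB -> 2147483648 bytes

print(f"snapshot needs {snapshot_used / 2**30:.2f} GiB, "
      f"limit is {maxmemory / 2**30:.2f} GiB")
# snapshot needs 2.43 GiB, limit is 2.00 GiB -> ~0.43 GiB over the limit
```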

I have tried many different variations, from changing systems to changing the tiering prefix path. Any help would be appreciated on whether I am using this feature correctly.

@darsh12 darsh12 added the bug Something isn't working label Mar 15, 2025
romange (Collaborator) commented Mar 15, 2025

Fixed by #4661; the fix will be available in v1.28.

@romange romange closed this as completed Mar 15, 2025