Dump and CSV file · Issue #841 · sqlmapproject/sqlmap

@WilliJoin

Description

  1. sqlmap overwrites a previously dumped table when the same scan is started again. A switch is needed that lets the user decide whether or not to overwrite.
  2. If I dump a table named, for example, "test" with 70,000 entries and then dump the same table again (or use --dump-all), sqlmap gets stuck on that table for a long time. It should skip tables that are already fully dumped, and resume partially dumped tables from the last entry.
  3. If the OS running python restarts, shuts down, or hits a Blue Screen of Death: imagine sqlmap has been dumping a table with 1,000,000 entries for the last 5 days and has retrieved most but not all of it. After such an interruption sqlmap cannot resume the scan; it starts everything from the beginning, even though session.sqlite appears to contain the already-dumped data. sqlmap should therefore save partial results to the dump file (not only to session.sqlite) every 10 minutes or so, instead of waiting for the full dump before writing results to the file.

Activity

stamparm commented on Sep 30, 2014

Member
  1. Can do this.
  2. "and if a table is not fully dumped then start dumping from the last entry" -> sqlmap resumes queries stored in the session file to minimize the number of repeated requests. We can't "resume from the last entry", because sqlmap sees everything as query<->reply pairs.
  3. We can't write to a dump file before the last row is retrieved, because of formatting and filtering.

I can feel your frustration, but writing a partial table dump that would be totally different from the final formatted and filtered output would just create new issues.
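The query<->reply model described here can be sketched as a session cache keyed by query: re-running a dump replays cached replies instead of re-sending requests. This is an illustrative simplification with hypothetical names (`SessionCache`, `ask`), not sqlmap's actual session schema:

```python
import sqlite3

class SessionCache:
    """Toy query<->reply cache backed by an SQLite session file."""

    def __init__(self, path):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS storage"
            " (query TEXT PRIMARY KEY, reply TEXT)"
        )

    def ask(self, query, send_request):
        row = self.conn.execute(
            "SELECT reply FROM storage WHERE query = ?", (query,)
        ).fetchone()
        if row is not None:
            return row[0]            # resumed from session, no request sent
        reply = send_request(query)  # only unseen queries hit the target
        self.conn.execute("INSERT INTO storage VALUES (?, ?)", (query, reply))
        self.conn.commit()
        return reply
```

This is why resumption happens per query rather than "from the last entry": the cache has no notion of table rows, only of individual query/reply pairs.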

mukareste commented on Sep 30, 2014

I am trying to figure out a legitimate reason to dump 70K entries. Is this a requirement of some sort of a penetration test?

added this to the 1.0 milestone on Feb 25, 2015
self-assigned this on Feb 25, 2015
changed the title from Extensions to Dump and CSV file on Feb 25, 2015
ghost commented on Jun 4, 2016

This also happens when extracting specific columns only and then restarting. +1 for the overwrite confirmation.
stamparm commented on Jun 5, 2016


@lukapusic with the latest commit, old table dumps are backed up. For example, after a couple of runs:

$ ll /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv*
-rw-rw-r-- 1 stamparm stamparm 921 Jun  5 12:36 /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv
-rw-rw-r-- 1 stamparm stamparm 854 Jun  5 12:34 /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv.1
-rw-rw-r-- 1 stamparm stamparm 921 Jun  5 12:36 /home/stamparm/.sqlmap/output/testasp.vulnweb.com/dump/acuforum/users.csv.2
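The backup naming shown in the listing (users.csv.1, users.csv.2, ...) can be sketched as picking the next free numeric suffix before a new dump overwrites the file. A hypothetical illustration (`backup_old_dump` is not sqlmap's actual function):

```python
import os

def backup_old_dump(path):
    """If path exists, move it aside to path.1, path.2, ... (next free
    suffix) so a fresh dump never silently overwrites an old one.
    Returns the backup path, or None if there was nothing to back up."""
    if not os.path.exists(path):
        return None
    index = 1
    while os.path.exists("%s.%d" % (path, index)):
        index += 1
    backup = "%s.%d" % (path, index)
    os.rename(path, backup)
    return backup
```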