Hi, I am running lima on PacBio MAS-seq long-read scRNA-seq data. This is the command I used:
lima /home/jiayiwang/loggedfs_david_penton/data/kinnex/segmented/PB258_3_r0362_bcM0004-segmented.bam data/primer/10x_3kit_primers.fasta /home/jiayiwang/loggedfs_david_penton/result/preprocessing/primer_removal/PB258_3_r0362_bcM0004.fl.5p--3p.bam --no-reports --num-threads 4 --isoseq
This command works for my other samples/BAMs. However, it returns an error for this specific sample:
terminate called after throwing an instance of 'std::length_error'
what(): cannot create std::vector larger than max_size()
Aborted (core dumped)
I am not sure what this error means. Is the number of reads too large? My BAM file is 67 GB. I tried --split-bam, but the error persists. It would be great if you could help me with this.
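In case it helps, here is how one could sanity-check the BAM itself before rerunning lima (a sketch, assuming samtools is installed; the path is the one from my command above):

# Check for a truncated or corrupt BAM (missing EOF block, unreadable header)
samtools quickcheck -v /home/jiayiwang/loggedfs_david_penton/data/kinnex/segmented/PB258_3_r0362_bcM0004-segmented.bam && echo "BAM looks intact"

# Count the records, in case the read count itself is the issue
samtools view -c /home/jiayiwang/loggedfs_david_penton/data/kinnex/segmented/PB258_3_r0362_bcM0004-segmented.bam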
Thanks very much in advance!
I am having the same problem with a hifi_reads.bam file. This is the command I used:
lima m84132_250124_184858_s3.hifi_reads.bam old-16S-primers.fasta Kinnex-m84132.asym.demux.bam --split --peek-guess --hifi-preset ASYMMETRIC
My BAM file is approximately the same size (63 GB), and I'm wondering whether file size is part of the problem, since I've been able to process smaller read files successfully.
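One way to test that (a sketch, assuming samtools is available; the subset size of 100000 SAM lines is arbitrary) would be to rerun lima on a small slice of the BAM and see whether the error still appears:

# Stream the header plus the first records into a small test BAM
# (head closing the pipe early is expected and harmless here)
samtools view -h m84132_250124_184858_s3.hifi_reads.bam | head -n 100000 | samtools view -b -o subset.hifi_reads.bam -

# Rerun the same lima command on the subset
lima subset.hifi_reads.bam old-16S-primers.fasta subset.demux.bam --split --peek-guess --hifi-preset ASYMMETRIC

If the subset demultiplexes cleanly, the failure is tied to the full file's size or to a specific record deeper in it, rather than to the command itself.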