Hello,
I'm uploading a GO enrichment table and using the mouse background table from the example data.
For one experiment, when I submit the first 700 entries, it works; submitting only entries 701-1000 also works.
However, when I submit the combined entries 1-1000, I get the error "Something went wrong. Please check your input files!"
For this experiment in particular I have about 3000 entries.
I tried another experiment with about 600 entries and got the same error.
Since the 1000 entries can be processed when broken into smaller batches, I believe my data format is correct.
Is there a cap on the number of entries? Does it depend on my system or on the server's capacity? Is there a way to raise the cap?
Thank you!
In addition, when the tool does return a results page, I am unable to download the plots: clicking the download icon only slows down the page, and after a while the browser warns that the page is consuming too many resources and suggests killing it.
Also, I found that leaving "Propagate background" unchecked usually improves performance. However, I am not sure whether that is recommended with the example background.
I'm on Windows 10 with a 4-core i5 processor and 16 GB of memory. I tried Chrome, Firefox, and Edge, with virtually the same behavior in each. Monitoring resource consumption in Task Manager, usage never exceeded 70% of CPU or memory.