Recently I rewrote Yota's tracer from Cassette to JuliaInterpreter, which I found more flexible and much easier to debug. However, the cost of this move is an extreme slowdown: a simple neural network that usually takes ~1 second now takes almost 10 minutes in the tracer, and most of that time is spent in JuliaInterpreter (I can provide exact code and timings if needed).
I see that a lot of work is being done to improve the situation, but in the end, what slowdown should we expect? Is it 5x, 10x, or 100x compared to the runtime of a normal call (including compilation time, to make the comparison fair)?
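For anyone wanting to reproduce this kind of measurement, a minimal sketch of how one might compare interpreted vs. native runtime with JuliaInterpreter's `@interpret` macro (the function `f` and input size here are illustrative placeholders, not the actual Yota workload):

```julia
using JuliaInterpreter

# Placeholder workload; swap in the real traced function to reproduce the issue.
f(x) = sum(abs2, x)
x = rand(1000)

# Warm up both paths so compilation time is excluded from the ratio;
# include it separately if you want the "fair" comparison mentioned above.
f(x)
@interpret f(x)

t_native = @elapsed f(x)
t_interp = @elapsed @interpret f(x)
println("interpreter slowdown: ", round(t_interp / t_native, digits=1), "x")
```

The ratio will vary a lot with the workload: tight numeric loops tend to show the worst interpreter overhead, while code dominated by calls into compiled library functions fares better.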
We are at a point where the current strategy doesn't have any real low-hanging fruit left (at least not at the order-of-magnitude level). More drastic changes like #309 will likely improve the situation, but it is not clear by how much, or whether they will ever be implemented at all.