we've been doing general-purpose computing on gpus since the late 90s, and it went mainstream with programmable shaders and then cuda in the 2000s. nobody needs to "disrupt" anything. stop trying to make everything a bloody gpu-accelerated neural network and just write a decent algorithm for once.