TPU – Tensor Processing Unit
GPU – Graphics Processing Unit
CPU – Central Processing Unit
This is basically the evolution of hardware, progressing hand in hand with software.
When software needs to run faster, new hardware is created. In the case of GPUs and TPUs, the driving use cases were graphics rendering and TensorFlow computation, respectively.
Conversely, when hardware is invented or improved, new software follows. My assumption is that people gradually start writing parallel programs because the hardware supports them.
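As a small illustration of the kind of parallel program that multi-core hardware encourages, here is a minimal sketch using Python's standard `concurrent.futures` module (the `square` function is just a made-up workload, not from the original text):

```python
from concurrent.futures import ProcessPoolExecutor

def square(n):
    # A stand-in for any CPU-bound piece of work.
    return n * n

if __name__ == "__main__":
    # Distribute the work across worker processes, one per CPU core
    # by default, instead of computing everything sequentially.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(square, range(8)))
    print(results)
```

The same loop written sequentially would produce identical results; the point is only that the hardware's extra cores make the parallel version pay off for larger workloads.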
TPUs were created by Google specifically to accelerate tensor computations, the kind performed at runtime by its TensorFlow library.
So, TPUs are specialized hardware circuitry. They are AI accelerators.
AI accelerators are specialized hardware that supports AI algorithms, which often rely on low-precision arithmetic.
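To see what low-precision arithmetic means in practice, here is a sketch using NumPy's 16-bit float as a stand-in for the reduced-precision formats accelerators use (this is an illustrative comparison on the CPU, not actual TPU code):

```python
import numpy as np

# In 32-bit precision, a tiny increment to 1.0 is representable.
x32 = np.float32(1.0) + np.float32(1e-4)

# In 16-bit precision, the mantissa is too short: the increment
# is smaller than the spacing between representable values near
# 1.0, so it is rounded away entirely.
x16 = np.float16(1.0) + np.float16(1e-4)

print(x32)  # slightly above 1.0
print(x16)  # exactly 1.0: the increment was lost
```

The trade-off is deliberate: neural-network workloads tolerate this loss of precision well, and halving the bit width lets the hardware move and multiply far more numbers per cycle.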