We have functionality to draw points (1 vertex), lines (2 vertices), and triangles (3 vertices).
Why don't we have the ability to draw a tetrahedron (4 vertices)?
The 4 vertices can be projected to screen space as usual. Then, for each pixel inside their screen-space footprint, the device can check whether _the existing value in the depth buffer_ corresponds to a 3D point inside the tetrahedron. If it does, the fragment shader runs, with values interpolated from the tetrahedron's vertices (volumetric interpolation).
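The math behind that check can be sketched on the CPU. This is a hypothetical illustration, not a real GPU API: the names `signed_volume`, `tet_barycentric`, and `shade_if_inside` are mine. A point is inside a tetrahedron exactly when all four of its barycentric weights (ratios of sub-tetrahedron volumes) are non-negative, and the same weights give the volumetric interpolation.

```python
# CPU sketch (hypothetical API) of the proposed per-pixel test:
# reconstruct a 3D point from the depth buffer, decide whether it lies
# inside the tetrahedron, and interpolate vertex values barycentrically.

def signed_volume(a, b, c, d):
    """Signed volume of tetrahedron (a, b, c, d) via the scalar triple product."""
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    ad = [d[i] - a[i] for i in range(3)]
    cross = [ab[1] * ac[2] - ab[2] * ac[1],
             ab[2] * ac[0] - ab[0] * ac[2],
             ab[0] * ac[1] - ab[1] * ac[0]]
    return sum(cross[i] * ad[i] for i in range(3)) / 6.0

def tet_barycentric(p, v0, v1, v2, v3):
    """Barycentric weights of p: each weight is the volume of the
    sub-tetrahedron where p replaces one vertex, over the total volume."""
    vol = signed_volume(v0, v1, v2, v3)
    return (signed_volume(p, v1, v2, v3) / vol,
            signed_volume(v0, p, v2, v3) / vol,
            signed_volume(v0, v1, p, v3) / vol,
            signed_volume(v0, v1, v2, p) / vol)

def shade_if_inside(p, verts, values, eps=1e-9):
    """Run the 'fragment shader' only if p is inside: return the
    volume-interpolated value, or None if p is outside the tetrahedron."""
    w = tet_barycentric(p, *verts)
    if all(wi >= -eps for wi in w):
        return sum(wi * vi for wi, vi in zip(w, values))
    return None
```

For the unit tetrahedron with vertex values `(0, 1, 2, 3)`, the centroid `(0.25, 0.25, 0.25)` gets all four weights equal to 0.25 and interpolates to 1.5, while a point like `(1, 1, 1)` is rejected.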
This would make it possible to approximate 3D functions with vertices that don't have to lie on a regular grid. Today we use 3D textures for that.
For example, we could create volumetric light sources and dimming sources that interact with existing geometry via the depth buffer. Take a mesh, extrude its faces along the vertex normals, and fill the resulting crust (between the base mesh and the extruded mesh) with tetrahedrons. Values such as light intensity can then be interpolated based on distance from the mesh surface.
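The crust construction above can be sketched per triangle: extruding one triangle along its vertex normals yields a triangular prism, and each prism splits into 3 tetrahedrons using the standard decomposition. This is a minimal sketch under my own naming (`extrude_triangle`, `prism_to_tets` are hypothetical helpers, not an existing API); per-vertex values for distance-based intensity would be attached separately (e.g. 1.0 at the base surface, 0.0 at the extruded cap).

```python
# Hypothetical sketch: build tetrahedra for the crust between a mesh
# triangle and its copy extruded along per-vertex normals.

def extrude_triangle(tri, normals, thickness):
    """Return (base, cap): the original triangle and its copy offset
    along the per-vertex normals by `thickness`."""
    cap = tuple(tuple(v[i] + thickness * n[i] for i in range(3))
                for v, n in zip(tri, normals))
    return tri, cap

def prism_to_tets(base, cap):
    """Split the triangular prism (base b0 b1 b2, cap c0 c1 c2)
    into 3 tetrahedra covering the whole prism volume."""
    b0, b1, b2 = base
    c0, c1, c2 = cap
    return [(b0, b1, b2, c0),
            (b1, b2, c0, c1),
            (b2, c0, c1, c2)]
```

Running this over every face of the base mesh gives the tetrahedralized shell; each tetrahedron would then be submitted as one 4-vertex primitive.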