can you explain to me why a voltage divider isn't an ideal voltage source? i get that it would consume way more current because like half of it is going to go to ground. but beyond that, i dunno. thanks, man (not to push things off-topic...)
The voltage divider has a resistor to ground, and your device ends up in parallel with that bottom resistor. So if the device's resistance changes, the effective resistance of the bottom half of the divider changes with it, and the output voltage changes too.
Things like voltage regulators are self-correcting, in that they use feedback to maintain a constant output voltage, whereas a voltage divider depends on the resistance of the load never changing. Also, if you set the voltage divider to exactly 5 V unloaded and then connect a device, it will no longer be delivering 5 V, and what it actually delivers will vary from device to device.
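To make that concrete, here's a small sketch of the math (the 10 V supply and 1 kΩ values are just example numbers I picked, not from anything above): the load appears in parallel with the bottom resistor, so the divider output is Vin · (R2 ∥ Rload) / (R1 + R2 ∥ Rload).

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def divider_out(vin, r_top, r_bottom, r_load=None):
    """Output of a resistive divider, optionally loaded by r_load."""
    r_eff = r_bottom if r_load is None else parallel(r_bottom, r_load)
    return vin * r_eff / (r_top + r_eff)

# Unloaded: 10 V across two equal 1 k resistors -> exactly 5 V
print(divider_out(10, 1000, 1000))        # 5.0

# Same divider driving a 1 k load: bottom leg becomes 1k || 1k = 500 ohms,
# so the output sags to 10 * 500 / 1500 ~= 3.33 V
print(divider_out(10, 1000, 1000, 1000))
```

A 1 kΩ load drags the "5 V" output down to about 3.33 V, and a different load resistance would drag it somewhere else, which is exactly why the divider can't act as a voltage source.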