High-performance logic long ago moved to very low voltages (sub-1V), as demanded by nanoscale fabrication technology, but many systems still use 3.3V and 5V signal interfaces, and interfacing between the two is a common problem. A 3.3V signal can usually drive a 5V input: the TTL minimum VIH is 2V, so there is no problem there. A 5V CMOS input is a little borderline, with VIH usually at 0.7*VCC = 3.5V; if the speed is relatively slow, a 3.3V signal can still switch it correctly.
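To make the margins explicit (using the typical datasheet thresholds quoted above; actual values vary by logic family):

\[
V_{IH,\mathrm{TTL}} = 2.0\,\mathrm{V} \;<\; V_{OH,3.3\mathrm{V}} \approx 3.3\,\mathrm{V} \;<\; V_{IH,\mathrm{CMOS}} = 0.7 \times 5\,\mathrm{V} = 3.5\,\mathrm{V}
\]

So a TTL input enjoys about 1.3V of noise margin, while a 5V CMOS input is formally 0.2V short of its specified threshold, which is why this works in practice but only marginally.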
Going the other way is more problematic. The culprit is usually the ESD protection diodes: they turn on when a 5V signal drives an input above its 3.3V supply rail. If the driver is strong, the resulting current can destroy the protection diode and damage the part. The safe current limit is usually about 10mA, so we can place a resistor in series to limit the current. 10mA is actually a lot of current; if speed is not a concern, the current can be much lower, say 1-2mA. With the diode dropping about 0.5V, the resistance works out to (5 - 3.3 - 0.5)/I, i.e. 600 Ohms at 2mA to 1200 Ohms at 1mA; 1K is a good value. Sometimes the driver is weak and its output voltage drops quickly once it starts sourcing a few milliamps; such a driver can be connected directly, but it is better to be safe. Another scheme is to use a diode in series; some even suggest an LED, which has a turn-on voltage of about 1.7V. A resistor still has to be used as a pull-up or pull-down, though, to define the level when the diode is off.
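The sizing is just Ohm's law across the series resistor once the protection diode clamps the input one diode drop (taken here as 0.5V, a typical figure; check the actual part) above the 3.3V rail:

\[
R = \frac{V_{drive} - V_{CC} - V_{clamp}}{I} = \frac{5 - 3.3 - 0.5}{I}
\;\Rightarrow\;
R = 600\,\Omega \text{ at } 2\,\mathrm{mA}, \qquad R = 1200\,\Omega \text{ at } 1\,\mathrm{mA}
\]

The trade-off is speed: the resistor forms an RC low-pass with the input capacitance, so 1K against a typical 10pF input gives a time constant around 10ns, fine for slow signals but noticeable at tens of MHz.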
Another issue is the voltage supply. We can start with 5V and drop it to 3.3V with an LDO. Sometimes, though, the system is predominantly 3.3V with only a very small current requirement at 5V. In such a case, a simple charge pump circuit generating 5V from 3.3V is actually convenient. Here is the circuit diagram,
and the constructed board (also including a series resistor for the signal interface),
It supplies a few milliamps quite nicely.
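Assuming the charge pump is the usual diode-capacitor voltage doubler driven by a 3.3V square wave (the exact topology is in the schematic above), the unloaded output is roughly

\[
V_{out} \approx 2 \times V_{CC} - 2 \times V_F = 2 \times 3.3 - 2 \times 0.7 \approx 5.2\,\mathrm{V}
\]

with ordinary silicon diodes (V_F ≈ 0.7V), landing close to the 5V target without further regulation; the output droops as the load current rises, consistent with the few-milliamp figure above.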