Nonlinear (state) feedback theory has reached a level of generality at which constructive methods, based on control Lyapunov functions, exist for all systems affine in the control and the disturbance. We present both the more widely known results for deterministic systems and very recent results for stochastic (continuous-time) systems. In the stochastic case we build the theory from scratch: starting with new definitions of equilibrium and input-output stability and new global Lyapunov theorems, we arrive at a new stochastic optimal control paradigm that is a more natural nonlinear extension of standard LQG/H_2 control than the widely studied risk-sensitive problem. We then propose an adaptive approach to stabilization in the presence of unknown noise covariance and obtain globally stabilizing controllers. In the linear case this result extends the "multiplicative noise" results of Wonham and of Willems & Willems to a larger class of systems.
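As one concrete instance of the constructive control-Lyapunov-function methods mentioned above, consider the deterministic case (a standard textbook sketch in our own notation f, g, V; the disturbance channel is omitted for brevity, and none of these symbols are taken from the results summarized here):

```latex
% System affine in the control u:
\[
  \dot{x} = f(x) + g(x)\,u, \qquad f(0) = 0 .
\]
% A control Lyapunov function V is a positive definite, radially
% unbounded function satisfying, for all x \neq 0,
\[
  \inf_{u}\,\bigl[\, L_f V(x) + L_g V(x)\,u \,\bigr] < 0 ,
\]
% from which Sontag's universal formula yields an explicit
% globally stabilizing feedback:
\[
  u(x) =
  \begin{cases}
    -\dfrac{L_f V + \sqrt{(L_f V)^2 + \lVert L_g V\rVert^4}}
           {\lVert L_g V\rVert^{2}}\,(L_g V)^{\top},
      & L_g V(x) \neq 0,\\[2ex]
    0, & L_g V(x) = 0.
  \end{cases}
\]
```

The stochastic constructions discussed in the text replace the Lyapunov derivative L_f V with an infinitesimal generator that includes an Itô correction term, which is where the noise covariance enters and why its value (or an adaptive estimate of it) matters for stabilization.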