The previous examples all packaged time into a discrete sequence of instants, and talked about the state of the system at time 0, time 1, time 2, etc. Many systems are more naturally modelled with a continuous time variable t. The state measurement then forms a function x(t) of time. Transformation rules can no longer be described in terms of moving from one instant of time to the next; instead we must give a rule for the change of x(t). This leads to a differential equation, which in its simplest form looks like

    dx/dt = f(t, x)
for some given function f of both time and the state. Together with the initial state x(0) = x_0, this is known as an initial value problem. In its modern form, dynamical systems theory incorporates much of the theory of differential equations. As it turns out, many of the basic concepts of dynamical systems have equal importance in all of the areas we have mentioned: symbolic dynamics, discrete-time dynamics, and initial value problems. In this course, we shall focus on symbolic and discrete-time dynamics, with occasional references to parallels in differential equations.
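The link between the two settings can be made concrete: discretizing time with a small step h turns the continuous rule dx/dt = f(t, x) into a discrete-time update x_{n+1} = x_n + h f(t_n, x_n), the forward Euler method. The sketch below (the function name `euler` and the chosen example are illustrative, not from the text) shows this for a simple initial value problem:

```python
def euler(f, x0, t0, t1, n):
    """Approximate the solution of dx/dt = f(t, x), x(t0) = x0,
    using n forward Euler steps on the interval [t0, t1]."""
    h = (t1 - t0) / n
    t, x = t0, x0
    for _ in range(n):
        # Discrete-time update: x_{n+1} = x_n + h * f(t_n, x_n)
        x = x + h * f(t, x)
        t = t + h
    return x

# Example: dx/dt = x with x(0) = 1 has the exact solution x(t) = e^t,
# so euler(...) at t = 1 should approximate e ≈ 2.71828.
approx = euler(lambda t, x: x, 1.0, 0.0, 1.0, 100_000)
```

In this sense every initial value problem gives rise to a family of discrete-time systems, one for each step size h, which is one reason the discrete-time concepts studied in this course carry over.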