Neural Networks: Representation
Any logical function over binary-valued (0 or 1) inputs x1 and x2 can be (approximately) represented using some neural network. T
The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1). T
A two layer (one input layer, one output layer; no hidden layer) neural network can represent the XOR function. F With no hidden layer, the network is a single logistic unit, which can only produce a linear decision boundary; XOR is not linearly separable, so at least one hidden layer is required.
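The last two points can be illustrated together: with one hidden layer, sigmoid units can compute XOR via the decomposition XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)). Below is a minimal sketch; the specific weights (±20 with biases ±10, ±30) are illustrative choices that saturate the sigmoid near 0 or 1 on binary inputs, not the only valid ones.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def xor(x1, x2):
    # Hidden layer: large weights push the sigmoid close to 0 or 1.
    h1 = sigmoid(-10 + 20 * x1 + 20 * x2)   # approximates OR(x1, x2)
    h2 = sigmoid(30 - 20 * x1 - 20 * x2)    # approximates NAND(x1, x2)
    # Output layer: AND of the two hidden units yields XOR.
    return sigmoid(-30 + 20 * h1 + 20 * h2)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, round(xor(a, b)))
```

Note the hidden activations h1 and h2 stay strictly inside (0, 1), as stated above; they only approximate the exact 0/1 logic values, which is why the representation is "approximate."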