Ok, let's try to clear this up:
"The bandwidth, or the data transfer rate, between the CPU and memory is very small in comparison with the amount of memory." -> The link between the processor and RAM is slow compared to the sheer amount of RAM at its disposal: only a trickle of words can cross it at a time.
"Thus programming is basically planning and detailing the enormous traffic of words through the von Neumann bottleneck, and much of that traffic concerns not significant data itself, but where to find it." -> Much of what travels between the processor and RAM is not useful data at all, but addresses telling the machine where to fetch or store data. The "words" Backus talks about here are both the instructions contained in the program and the data they operate on.
What happens in a von Neumann computer is the following cycle:
1. FETCH (get program instructions from RAM)
2. DECODE (what does the instruction mean?)
3. FETCH OPERANDS (get the data to fill in the variables in the instruction)
4. EXECUTE (perform the actions of the instruction)
5. UPDATE INSTRUCTION POINTER (keep track of which instruction we are executing)
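The five steps above can be sketched as a toy interpreter. This is a hypothetical machine with a made-up two-instruction "ISA" (ADD and HALT), not any real processor, but it shows the key point: instructions and data live in the same memory, and every step of the cycle is a round trip to that memory.

```python
# A minimal sketch of the fetch / decode / fetch-operands / execute /
# update-IP cycle. The opcodes and memory layout are invented for
# illustration; program and data share one RAM, as in a von Neumann machine.

def run(memory, ip=0):
    """Run until a HALT instruction; every step goes through RAM."""
    while True:
        # 1. FETCH: get the instruction word from RAM
        opcode, *operand_addrs = memory[ip]
        # 2. DECODE: what does the instruction mean?
        if opcode == "HALT":
            return memory
        if opcode == "ADD":
            # 3. FETCH OPERANDS: more RAM traffic, just to find the data
            a, b, dest = operand_addrs
            # 4. EXECUTE: the only step doing "real" work
            memory[dest] = memory[a] + memory[b]
        # 5. UPDATE INSTRUCTION POINTER
        ip += 1

# Instructions live at addresses 0-1, data at addresses 2-4: same RAM.
ram = {
    0: ("ADD", 2, 3, 4),   # memory[4] = memory[2] + memory[3]
    1: ("HALT",),
    2: 5,
    3: 7,
    4: 0,
}
run(ram)
print(ram[4])  # -> 12
```

Notice how much of the loop is bookkeeping (fetching, decoding, tracking the instruction pointer) rather than the one line that actually computes something.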
This cycle repeats over and over, and it happens mostly serially. Programs and data alike are stored in RAM, and all the traffic to and from RAM goes through the processor. Thus the processor-RAM connection is the bottleneck that constrains the overall speed of a von Neumann computer.
Remember that there have been quite a few developments since von Neumann proposed his architecture and since Backus criticised it. Things like caches (temporary intermediate storage close to the processor), for instance, are new.
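What a cache buys you can be sketched with a toy model. This is a hypothetical direct-mapped cache with made-up parameters (8 lines, one word per line), not any real hardware, but it shows the idea: repeated accesses to the same addresses stop crossing the bottleneck.

```python
# A minimal sketch of a direct-mapped cache: each address maps to one
# cache line, and only misses require a trip across the processor-RAM
# bottleneck. Line count and access pattern are illustrative only.

class Cache:
    def __init__(self, lines=8):
        self.tags = [None] * lines   # which address each line holds
        self.lines = lines
        self.hits = 0
        self.misses = 0

    def access(self, addr):
        line = addr % self.lines
        if self.tags[line] == addr:
            self.hits += 1           # served from cache: no RAM trip
        else:
            self.misses += 1         # must cross the bottleneck to RAM
            self.tags[line] = addr   # keep a copy for next time

c = Cache()
for _ in range(100):                 # a loop re-reading the same 4 words
    for addr in (0, 1, 2, 3):
        c.access(addr)
print(c.hits, c.misses)              # -> 396 4
```

Out of 400 accesses, only the first 4 touch RAM; the cache absorbs the rest. That is why caches help, but they only hide the bottleneck for data you reuse, they do not remove it.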
However, the basic problem remains: computation time is dominated by the time it takes to shuttle data between RAM and the CPU. Obviously this problem grows with the size of RAM. "But more RAM makes my machine _faster_!" you might object. True: RAM is faster than your hard disk, but it is painfully slow nevertheless.

Computers are good at all sorts of tasks, especially tasks that humans (or other biological information processing systems) are bad at. Computers are good at maths and chess; humans are good at catching balls. Guess what kind of computer could calculate the trajectory of a ball flying in a parabolic arc at slightly non-linear speed _in real time_, while matching that movement with a limb able not just to hit the ball, but to catch it? Think your desktop can do it? No way, think XServe cluster. Computers can't even (yet) reliably decode an image to find the 3D contours of objects in real time to, say, steer a car. Humans can, because our brains do not have a von Neumann bottleneck.

Forget Mac vs PC benchmarks: computers are painfully slow when compared to biological information processing systems. Real-life reaction time: that is where the bottleneck can be felt.
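To be fair about what the hard part is: the textbook kinematics of the ball is the easy bit, and any desktop does it instantly. The numbers below are made up for illustration; the real difficulty the paragraph above points at is perceiving the ball and steering a limb in real time, not this arithmetic.

```python
# The "easy" part of catching a ball: closed-form projectile kinematics,
# ignoring air resistance. Initial speed and angle are invented examples.
import math

g = 9.81  # gravitational acceleration, m/s^2

def trajectory(v0, angle_deg, t):
    """Position (x, y) in metres of a ball at time t seconds."""
    a = math.radians(angle_deg)
    x = v0 * math.cos(a) * t
    y = v0 * math.sin(a) * t - 0.5 * g * t * t
    return x, y

# A ball thrown at 10 m/s and 45 degrees, after half a second:
x, y = trajectory(10.0, 45.0, 0.5)
print(round(x, 2), round(y, 2))  # -> 3.54 2.31
```

Computing this formula takes microseconds; extracting v0, the angle, and the ball's position from a stream of camera images fast enough to move a hand is what still takes a cluster.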