In Europe and America, around one in five people has a disability. And since disabled people are less likely to be in work, their poverty rate is about twice as high.
So technologies that could help disabled people contribute more in the workplace - and improve their quality of life - are surely welcome.
It also makes good business sense.
If a million more disabled people could work, the UK economy alone would grow 1.7%, or £45bn ($64bn), says disability charity Scope.
The eyes have it
Motor neuron disease affects 400,000 people worldwide, including renowned scientist Professor Stephen Hawking. Multiple sclerosis affects 2.3 million.
But neurons controlling eye movement are more resistant to degenerative diseases. This is also true of other parts of the face, like the cheek, which Prof Hawking uses to communicate.
US company LC Technologies has invented a device that enables people to control a computer using just their eyes.
Eyegaze Edge is the latest invention of the company, which was founded in 1988 by a group of engineers in a basement.
The company solved the basic scientific problems early on, but its first device was cumbersome and very expensive.
"We crammed it in the back of a single-engine plane and took it around to towns where there was a need," says medical director Nancy Cleveland.
"Now, it fits in a suitcase in a commercial aircraft."
The technology behind Eyegaze is called Pupil Centre/Corneal Reflection, or PCCR. A tablet is set up in front of the user, with a small video camera underneath. A near-infrared LED (light-emitting diode) light illuminates the user's eye.
The camera then measures the distance between the centre of your pupil and the reflection of LED light on your cornea - the transparent bit of your eye at the front.
This tiny distance shifts as your gaze changes, and this enables a computer to work out exactly where you're looking.
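The core of the PCCR approach can be sketched in a few lines. This is an illustrative toy, not LC Technologies' actual algorithm: it assumes a simple linear calibration (the gains and offsets would normally be fitted while the user looks at known on-screen targets), and all the names are invented for the example.

```python
# PCCR sketch: the vector from the corneal reflection (glint) to the
# pupil centre shifts with gaze, and a calibration maps it to a screen
# position. Linear model and names are assumptions for illustration.

def gaze_point(pupil_centre, glint, calibration):
    """Map a pupil-glint vector (camera pixels) to an (x, y) screen point."""
    dx = pupil_centre[0] - glint[0]
    dy = pupil_centre[1] - glint[1]
    screen_x = calibration["gain_x"] * dx + calibration["offset_x"]
    screen_y = calibration["gain_y"] * dy + calibration["offset_y"]
    return screen_x, screen_y

# Toy calibration values for a 1920x1080 display.
cal = {"gain_x": 120.0, "offset_x": 960.0,
       "gain_y": 120.0, "offset_y": 540.0}
print(gaze_point((104, 58), (100, 60), cal))  # → (1440.0, 300.0)
```

Real systems refine this with non-linear calibration and head-movement compensation, but the measured quantity is the same: one tiny pixel vector per video frame.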
"People have done all kinds of interesting jobs," says Ms Cleveland, "and all they had was the ability to move their eyes."
She says about 12 books have been written using the device.
A similar device is the HeadMouse Nano, recently developed by Texas-based Origin Instruments.
A camera tracks the movements of a reflective dot stuck to the user's forehead, and these motions control a computer cursor.
Selections are made using a "sip-puff" switch in the mouth, or by dwell time - how long the head stays in a certain position.
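The dwell-time idea is simple to state in code. The sketch below is an assumed implementation of that general logic, not Origin Instruments' own: a click fires when the cursor stays within a small radius for long enough, and any larger movement restarts the timer.

```python
import math

class DwellClicker:
    """Toy dwell-time selector: hold still long enough and it clicks."""

    def __init__(self, radius=15.0, dwell_seconds=1.0):
        self.radius = radius        # max drift allowed, in pixels
        self.dwell = dwell_seconds  # hold time required for a click
        self.anchor = None          # where the current dwell started
        self.start = None           # when the current dwell started

    def update(self, pos, t):
        """Feed a cursor position and timestamp; return True on a click."""
        if self.anchor is None or math.dist(pos, self.anchor) > self.radius:
            self.anchor, self.start = pos, t  # moved too far: restart dwell
            return False
        if t - self.start >= self.dwell:
            self.anchor, self.start = None, None  # reset after the click
            return True
        return False

clicker = DwellClicker(dwell_seconds=0.5)
print(clicker.update((100, 100), 0.0))  # → False (dwell just started)
print(clicker.update((102, 101), 0.6))  # → True (held within radius)
```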
It requires slightly more motor ability in its users, but is cheaper.
"Lately, we've reduced size and power consumption," says Origin's vice president Mel Dashner, who worked on tracking devices for aircraft during the Cold War. "We're mainly riding the wave of cell phone technology like everybody else."
There are about 39 million blind people in the world, according to the World Health Organisation. But 90% have at least some level of light perception.
So Stephen Hicks, a neuroscientist at Oxford University, has developed "smart glasses" that accentuate the contrast between light and dark objects.
"We try to represent the world in simple and unambiguous real-time images," he says.
The nearest object is rendered bright, the rest of the field is black, and the contrast between them is cranked up to maximum.
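That depth-to-brightness step can be illustrated with a toy filter. This is an assumption about the general approach, not Mr Hicks' actual pipeline: pixels nearer than a threshold are rendered at full brightness, everything further away is black.

```python
# Toy depth-thresholding filter: keep only the nearest surfaces, at
# maximum contrast. Threshold value is an invented example figure.

def high_contrast(depth_map, near_threshold):
    """depth_map: 2D list of distances in metres; returns a 0/255 image."""
    return [[255 if d <= near_threshold else 0 for d in row]
            for row in depth_map]

depths = [[0.8, 2.5],
          [1.1, 4.0]]
print(high_contrast(depths, near_threshold=1.5))  # → [[255, 0], [255, 0]]
```

Even a person with only residual light perception can then pick out the bright silhouette of the nearest obstacle.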
Mr Hicks started working on the glasses in 2010, with tech firm Epson providing the see-through computer displays.
He has since had additional help from the Royal National Institute for the Blind, and prize money from a Google Impact Challenge award.
The biggest challenge for him has been in keeping the weight down - if the glasses weigh more than 120g (4.2oz) wearers get headaches, he says.
So he has put the battery and processing unit into a handset, connected to the glasses by a small cable.
Technology can even help the 1.5 million people in the world who are deaf and blind. Helen Keller, most famously, was the first deafblind person to earn a bachelor of arts degree in 1904.
Deafblind people can communicate using tactile alphabets - pressing or pinching different parts of the hand represents different letters.
Now Nicholas Caporusso, from Bari in southern Italy, has developed a way of turning these movements and touches into electronic signals via a special glove.
Sensors in his dbGLOVE turn these alphabet tracings into computer text, and actuators trace the letters back onto the hand. This will enable deafblind people to operate computers and smartphones.
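The decoding side of such a glove amounts to a lookup from touch zones to letters. The sketch below is purely illustrative: the zone names and mapping are invented, since the article does not describe dbGLOVE's actual sensor layout.

```python
# Toy tactile-alphabet decoder: each sensed zone on the hand maps to a
# letter. Zone names and the mapping are assumptions for illustration.

TOUCH_TO_LETTER = {
    "index_tip": "A",
    "index_base": "B",
    "middle_tip": "C",
    "palm_centre": "D",
}

def decode(touches):
    """Turn a sequence of sensed touch zones into text."""
    return "".join(TOUCH_TO_LETTER.get(zone, "?") for zone in touches)

print(decode(["index_tip", "index_base", "palm_centre"]))  # → "ABD"
```

The actuators run the same mapping in reverse, pressing the zones for each incoming letter so messages flow both ways.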
Mr Caporusso hopes the final device, which was developed with two partners from Finland - where Nokia has left a legacy of mobile phone inventiveness - will be ready early this year.
"The perfect match of Italian design and Finnish technology," Mr Caporusso calls it.
The biggest challenge was size, he says, as it is with many of these assistive technologies: "All these cables, actuators, and sensors are in a very small space."
Advances in 3D printing and bio-electronics are also helping replace missing limbs with prosthetics and give disabled people extra functionality.
For example, in 2014, Ontario-based Thalmic Labs released an armband called the Myo. It enables a person to control computer devices by reading the electricity produced by their skeletal muscles and then sending these signals wirelessly via Bluetooth to the device.
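The underlying signal processing can be illustrated with a minimal sketch. This is a generic muscle-activation detector, not Thalmic's method: it simply checks whether the rectified EMG signal's average amplitude crosses a threshold, with all values invented for the example.

```python
# Toy EMG activation detector: a clenched muscle produces a larger
# average rectified amplitude than a resting one. Threshold and sample
# values are assumptions for illustration.

def detect_activation(samples, threshold=0.3):
    """samples: EMG readings (arbitrary units); True if muscle is active."""
    mean_amplitude = sum(abs(s) for s in samples) / len(samples)
    return mean_amplitude >= threshold

resting = [0.02, -0.03, 0.01, -0.02]
clench = [0.5, -0.6, 0.4, -0.45]
print(detect_activation(resting))  # → False
print(detect_activation(clench))   # → True
```

A real armband classifies patterns across several electrodes into distinct gestures, but each channel starts from the same kind of amplitude measurement.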
In December 2015, researchers at Johns Hopkins University in Baltimore adapted this armband to control a prosthetic limb.
Thalmic's chief executive, Stephen Lake, says Myo "slides right on the arm, with no surgery or skin prep, and provides much more reliable signals than you can get with electrodes."
The technology was originally developed to facilitate gesture-controlled presentations and has been used by DJs to control lighting displays.
And if such assistive technology can be used by non-disabled people, too, it can be made more cheaply to the benefit of all.