Run with a single bit?

Run with a single bit?

Tom Johnson
Friam Friends:

A recent article passed along by George Duncan says:

"Now, Varma's team in India and Microsoft researchers in Redmond, Washington, (the entire project is led by lead researcher Ofer Dekel) have figured out how to compress neural networks, the synapses of Machine Learning, down from 32 bits to, sometimes, a single bit and run them on a $10 Raspberry Pi, a low-powered, credit-card-sized computer with a handful of ports and no screen."

How, or what, can you do with a "single bit"?

TJ

============================================
Tom Johnson
Institute for Analytic Journalism   --     Santa Fe, NM USA
505.577.6482 (c)    505.473.9646 (h)
Society of Professional Journalists
Check out It's The People's Data
http://www.jtjohnson.com    [hidden email]
============================================


Re: Run with a single bit?

Dean Gerber
All or nothing

Sent from my iPhone


Re: Run with a single bit?

Roger Frye-4
In reply to this post by Tom Johnson
It's the width of the channel.  They had been reducing the precision of the floating-point numbers transmitted between nodes from 64 bits to 16, and then to 8.  Nothing prevents you from reducing further, to a more complicated network of on/off (one-bit) channels.
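
A minimal sketch of that precision ladder in Python with NumPy (the values and the 8-bit fixed-point scaling are invented for illustration, not anyone's actual pipeline):

    import numpy as np

    w64 = np.array([0.70710678, -0.33333333], dtype=np.float64)  # full precision
    w16 = w64.astype(np.float16)               # 64 -> 16 bits: small rounding error
    w8  = np.round(w64 * 127).astype(np.int8)  # 8-bit fixed point: coarser steps
    w1  = (w64 >= 0).astype(np.int8)           # 1 bit: only on/off (the sign) survives
    print(w16, w8, w1)

Each step down the ladder trades accuracy for memory and bandwidth; at one bit, all that is left of each number is whether the channel is on or off.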

Reminds me of my early days in computing, when I wanted to boot one computer from another.  I keyed in a one-bit loader on the empty computer, then ran a single wire from a console light on the live computer to an input switch on the empty one and transmitted the code for a more flexible loader over the wire.
-Roger



Re: Run with a single bit?

Barry MacKichan
In reply to this post by Tom Johnson

"How, or what, can you do with a 'single bit'?"


Start a discussion on Friam.

--Barry



Re: Run with a single bit?

Roger Frye-4
The Turing machine worked on tapes of bits.


Re: Run with a single bit?

Stephen Guerin-5
In reply to this post by Tom Johnson
Looking into Ofer Dekel's work, I think the journalist messed up this sentence:

compress neural networks, the synapses of Machine Learning, down from 32 bits to, sometimes, a single bit


The team is not compressing neural networks to one bit. They are compressing the individual weights used in a neural network from 32 bits to one bit.

Weights (along with biases) are the proxies for synapses in biological neural networks. The weight on each incoming link to a node is an activation strength (or an inhibition, if negative); it multiplies the outgoing signal strength of the upstream neighbor. As a number, a weight can be expressed in 32 bits or in as little as 1 bit, though a 1-bit weight encoding only on/off would not allow for inhibition. The weights are what get tuned during machine learning.

One can also explore the topology of the neural network (which nodes are connected to which) during learning; that is part of the basis for the new craze around Deep Learning. The technique has been around since the '90s but has only now found its use with the availability of data. E.g., here's a paper of mine from '99 implementing some of the research from UT Austin at the time.
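
A minimal sketch of that compression (not Dekel's actual method; the toy layer and its numbers are invented for illustration): replace each 32-bit weight with just its sign and compute the same weighted sums.

    import numpy as np

    def binarize(W):
        # Keep only the sign of each weight: +1 excitatory, -1 inhibitory.
        # A sign encoding still permits inhibition; an on/off {0, 1}
        # encoding, as noted above, would not.
        return np.where(W >= 0, 1.0, -1.0)

    x = np.array([0.5, -1.2, 0.3])         # outgoing signals of the upstream neighbors
    W = np.array([[ 0.8, -0.1,  0.4],      # 32-bit weights on the incoming links
                  [-0.6,  0.9, -0.2]])     # of two downstream nodes

    print(W @ x)            # weighted sums with full-precision weights
    print(binarize(W) @ x)  # the same sums with 1-bit (sign-only) weights

In this toy case the binarized sums keep the same signs as the full-precision ones, which is much of what a downstream activation cares about.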

I think it would have been clearer for the journalist to write:

Ofer Dekel's team is researching methods to reduce the memory and processing requirements of neural networks so they can run on smaller devices at the edge of the network, closer to the sensors. One method is to reduce the number of bits needed to describe the weights between nodes in a neural network from 32 bits down to as little as one bit.
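
Back-of-the-envelope on why that matters at the edge: a network with a million weights needs about 4 MB at 32 bits per weight but only about 125 KB at 1 bit per weight, a 32x reduction in memory and correspondingly cheaper arithmetic.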

-S
_______________________________________________________________________
[hidden email]
CEO, Simtable  http://www.simtable.com
1600 Lena St #D1, Santa Fe, NM 87505
office: (505)995-0206 mobile: (505)577-5828
twitter: @simtable


Re: Run with a single bit?

Tom Johnson
Many thanks for the insightful explanation, Steve. 
T

===================================
Tom Johnson - Inst. for Analytic Journalism
Santa Fe, NM
[hidden email]               505-473-9646
===================================


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove