more fun with AI


more fun with AI

Roger Critchlow-2
Okay, this one got published in Science today: https://arxiv.org/abs/1606.02318. They solve an n-body quantum wave function with artificial neural nets, and it earned two separate commentary articles:

The challenge posed by the many-body problem in quantum physics originates from the difficulty of describing the non-trivial correlations encoded in the exponential complexity of the many-body wave function. Here we demonstrate that systematic machine learning of the wave function can reduce this complexity to a tractable computational form, for some notable cases of physical interest. We introduce a variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons. A reinforcement-learning scheme is then demonstrated, capable of either finding the ground state or describing the unitary time evolution of complex interacting quantum systems. We show that this approach achieves very high accuracy in the description of equilibrium and dynamical properties of prototypical interacting spin models in both one and two dimensions, thus offering a new powerful tool to solve the quantum many-body problem.
This is getting sort of close to home now: we're replacing cleverly contrived numerical methods for exotic quantum physics with generic machine-learning algorithms.
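For the curious: the network in the paper is a restricted Boltzmann machine, and its hidden units can be summed out analytically, so the amplitude of a spin configuration is a cheap closed-form product. A minimal sketch of just the representation (the sizes and real-valued random parameters below are my placeholders; the paper actually trains complex-valued weights by variational Monte Carlo):

import numpy as np

# RBM wavefunction ansatz: with +/-1 hidden units summed out,
#   Psi(s) = exp(sum_j a_j s_j) * prod_i 2*cosh(b_i + sum_j W_ij s_j)
def rbm_amplitude(s, a, b, W):
    theta = b + W @ s                       # effective field seen by each hidden unit
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

# Toy example: 4 visible spins, 8 hidden units, random parameters.
rng = np.random.default_rng(0)
a = rng.normal(scale=0.1, size=4)
b = rng.normal(scale=0.1, size=8)
W = rng.normal(scale=0.1, size=(8, 4))

s = np.array([1, -1, 1, -1])                # one spin configuration
print(rbm_amplitude(s, a, b, W))

The claim of the paper is that optimizing a handful of parameters like these gets very high accuracy on the spin models they test.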

-- rec --


============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove

Re: more fun with AI

Russell Standish-2
On Thu, Feb 09, 2017 at 05:20:58PM -0500, Roger Critchlow wrote:
> Okay, this one got published in Science today:
> https://arxiv.org/abs/1606.02318. They solve an n-body quantum wave
> function with artificial neural nets, and it earned two separate
> commentary articles:
>

How interesting! I have downloaded this for later perusal. I have long
thought there is some intimate connection between the structure of
brains and the projection operator. If they're able to determine the
ground state of an n-body quantum system efficiently using a
brain-like structure, that strongly hints at such a connection. I'm
sure they don't say so in the article, though; I can't imagine Science
publishing such airy-fairy stuff.


Cheers
--

----------------------------------------------------------------------------
Dr Russell Standish                    Phone 0425 253119 (mobile)
Principal, High Performance Coders
Visiting Senior Research Fellow        [hidden email]
Economics, Kingston University         http://www.hpcoders.com.au
----------------------------------------------------------------------------


Re: more fun with AI

Steve Smith
In reply to this post by Roger Critchlow-2

Very exciting... I'll have to read deeper into this... I think we are on the verge of another punctuation in our equilibrium (of Sci/Tech advances)...




Re: more fun with AI

Marcus G. Daniels
In reply to this post by Roger Critchlow-2

Roger writes:

This is getting sort of close to home now: we're replacing cleverly contrived numerical methods for exotic quantum physics with generic machine-learning algorithms.

The compression is a factor of 40 better than those algorithms achieve. The problems aren't super hard, though; I wonder what would happen with a spin glass.

Marcus



Re: more fun with AI

Roger Critchlow-2
I watched the livestream from the TensorFlow Dev Summit in Mountain View yesterday.  The individual talks are already packaged up as individual videos at https://events.withgoogle.com/tensorflow-dev-summit/videos-and-agenda/#content, but watching the livestream, with its enforced moments of dead time filled with vaguely familiar music (was that Philip Glass, or a network trained on him?), was very instructive.

TensorFlow is a dataflow graph language where the data is all tensors, i.e. vectors, matrices, and higher-dimensional globs of numbers.  Google open-sourced it as Python scripts and a C++ kernel about a year ago, has updated it with minor releases monthly, and released 1.0 yesterday.  It's been used all over the place at Google, it's the top machine-learning repo on GitHub, and its products have made the cover of Nature two or three times in the past year.
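For flavor, here is a minimal sketch of that build-a-graph-then-run-it style in the 1.0-era Python API; the shapes and names are my own placeholders:

import tensorflow as tf  # TensorFlow 1.x API

# Build a tiny dataflow graph: y = x W + b.  Nothing runs yet;
# these lines only add tensor-valued nodes to the graph.
x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
W = tf.Variable(tf.random_normal([3, 2]), name="W")
b = tf.Variable(tf.zeros([2]), name="b")
y = tf.matmul(x, W) + b

# Execution is a separate step: feed data in, fetch tensors out.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))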

New stuff yesterday:
  • an LLVM compiler to native x86, ARM, NVIDIA, etc.
  • new language front ends
  • pre-built networks and network components
  • classical ML techniques, in case deep-learning networks aren't your thing
  • distributed model training on PCs, servers, and GPUs
  • a server architecture for delivering inferences at defined latency
  • embedded inference stacks for Android, iOS, and Raspberry Pi
  • a very sweet visualizer, TensorBoard, for network architectures, parameters, and classified sets (a minimal logging sketch follows this list)
  • higher-level APIs
  • networks trained to find network architectures for new classes of problems
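Here's roughly what the TensorBoard hookup looks like in the 1.0-era API, with a made-up scalar standing in for a real training loss; you then point the tensorboard CLI at the log directory:

import tensorflow as tf

# A stand-in scalar to log; in real use this would be your model's loss.
step = tf.placeholder(tf.float32, name="step")
loss = tf.square(1.0 / (step + 1.0), name="fake_loss")
tf.summary.scalar("loss", loss)                  # declare a scalar to track
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter("/tmp/tf_logs", sess.graph)  # also records the graph
    for i in range(100):
        summary = sess.run(merged, feed_dict={step: float(i)})
        writer.add_summary(summary, global_step=i)
    writer.close()

# Then browse the results with:  tensorboard --logdir /tmp/tf_logs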
You can get a lot of this just by watching the keynote, even just the first ten minutes.

Whether you buy the Kool-Aid or not, it's an impressive demonstration of the quantity and quality of Kool-Aid that the Google mind can produce when it decides that it needs Kool-Aid.

An LSTM is a Long Short-Term Memory node, a basic building block of the networks that translate languages or process other variable-length symbol strings.
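A minimal sketch of wiring one up with the 1.0-era API (layer size and shapes are my placeholders): the cell defines one time step, and dynamic_rnn unrolls it across sequences while respecting each sequence's true length.

import tensorflow as tf

# Batch of variable-length symbol sequences, already embedded as vectors:
# shape (batch, max_time, features), plus each sequence's true length.
inputs = tf.placeholder(tf.float32, [None, None, 16], name="inputs")
lengths = tf.placeholder(tf.int32, [None], name="lengths")

cell = tf.contrib.rnn.BasicLSTMCell(num_units=32)     # one LSTM layer, 32 units
outputs, final_state = tf.nn.dynamic_rnn(
    cell, inputs, sequence_length=lengths, dtype=tf.float32)
# outputs: hidden state at every time step; final_state: the (cell, hidden)
# pair at each sequence's last valid step -- the usual input to a classifier
# or decoder.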

-- rec --



Re: more fun with AI

Steve Smith

holy shite, REC!  Looks like pretty good Kool-Aid!

I cut my teeth 40 years ago on APL.  It feels like what I *wished for* back then (studying Physics/Math, with CS "just a tool").

As we discussed a few years ago, I have a (still open, hanging-fire) project to do real-time stitching on a 360 stereographic camera (84 cameras in a spherical array, with more than 50% overlap with each neighbor, E/W and N/S)...

- Steve
