Hi, I am Shahnawaz

I am a Ph.D. student at the Wallenberg Centre for Quantum Technology at Chalmers University of Technology, Göteborg.

My research interests lie at the intersection of quantum information and computing and machine learning (ML). I am especially interested in ML methods applied to problems in quantum information (MLQ), but I will also be working towards developing techniques that may potentially enhance ML using quantum systems (QML).

Previously, I worked in the Theoretical Quantum Physics Group of Prof. Franco Nori at RIKEN, Japan, as a master's thesis student, on numerical approaches to solving problems in open quantum systems.


I have a mix of interests and experiences in

Quantum Information & Computing

Machine Learning

Open Source Scientific Computing

About

– Education –

Chalmers Tekniska Högskola (Göteborg, Sweden)

Doktorand, Applied Quantum Physics Laboratory, MC2; 2018 - 2022 (expected)

Birla Institute of Technology and Science (Goa, India)

M.Sc.(Hons.) Physics; 2013 - 2018

B.E.(Hons.) Electrical & Electronics; 2013 - 2018

– Experience –

Theoretical Quantum Physics Lab, RIKEN, Wako, Japan

(International Program Associate, April 2018 - July 2018)

Worked on the development of numerical techniques for simulating open quantum systems, for ensembles of qubits and for qubits in a bath with strong and ultrastrong coupling. Developed a Python package, PIQS, in collaboration with Dr. Nathan Shammah as part of the work on:

Open quantum systems with local and collective incoherent processes: Efficient numerical simulations using permutational invariance, Nathan Shammah, Shahnawaz Ahmed, Neill Lambert, Simone De Liberato, and Franco Nori, Phys. Rev. A (accepted 6 November 2018)

Theoretical Quantum Physics Lab, RIKEN, Wako, Japan

(Bachelor thesis, July 2017 - March 2018)

Bachelor thesis on Deep Learning Constraints: how deep neural networks learn rules and functions from data, with the specific case of learning the rules of Sudoku.

Next Generation Computing Lab, Ritsumeikan University

(Intern, Dec 2016 - Jan 2017)

Guide: Prof. Shigeru Yamashita, NGC Lab

Worked on the development of a pipeline for the optimization of topological quantum circuits starting from the ICM representation (Paler et al., 2015). Developed code for the conversion of quantum circuits to the ICM representation and for their visualization. QuTiP PR

Google Summer of Code 2016, Python Software Foundation

(Intern, May 2016 - Aug 2016)

Mentor: Dr. Ariel Rokem, Senior Data Scientist, University of Washington eScience Institute

Developed a Python module for Magnetic Resonance Image (MRI) reconstruction based on the Intra-voxel Incoherent Motion model (Le Bihan, 84), which was released as part of Dipy, an open-source Python package for computational neuroanatomy.

Quantum Information and Computing Group, HRI, Allahabad

(SRF, July 2016 - Aug 2016)

Guide: Prof. Ujjwal Sen, Associate Professor, HRI, Allahabad

Indian Academy of Sciences (IAS) Summer Research Fellow. Studied quantum entanglement, measures of classical and quantum correlations, and the application of Bell inequalities in quantum cryptography. In particular, analysed the E91 protocol (Ekert, 1991) and the use of Bell inequalities in device-independent quantum cryptography.

Computational Biology Group, Institute of Mathematical Sciences, Chennai

(Intern, Dec 2015 - Jan 2016)

Guide: Prof. Sitabhra Sinha, adjunct faculty of the National Institute of Advanced Studies (NIAS), Bangalore.

Simulated a Hodgkin-Huxley-inspired model for electrical signalling in bacterial biofilms using Python. Tested the ability of bacterial biofilms to behave as excitable media by extending the 1D model in the study (Prindle et al., 2015) to 2D and analysing the results for various initial conditions.

Contact

Shahnawaz Ahmed

Email

shahnawaz.ahmed95@gmail.com

Github

www.github.com/quantshah

LinkedIn

https://www.linkedin.com/in/quantshah/

Resources

1. Machine learning phases of matter

(Juan Carrasquilla and Roger G. Melko, Nature Physics, Feb 2017)

This is a very interesting and simple-to-understand article on how a neural network can “learn” physical laws.

  • Using neural networks to classify phases in a simple square-lattice Ising model.
  • Data is generated for the high-temperature and low-temperature cases using standard Monte Carlo techniques, assuming the given model.
  • A neural network consisting of one hidden layer with three neurons, for detecting ↑, ↓, and unpolarised spins, can classify the state and obtain a good value of the critical temperature for the cross-over.
  • The weights get adjusted to approximately become linear functions of the magnetization. Even a more complicated network of 100 neurons gets clustered into the same broad categories, making a decision based on polarisation.
  • The supplementary material discusses the Aubry-André model and why a convolutional neural network has high discriminative power in classifying metallic/Anderson-localised phases.
  • In the context of the Ising lattice gauge theory, the convolutional-layer filters determine whether the energetic constraints of individual plaquettes are satisfied or not.
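The three-neuron picture above can be made concrete with a small sketch. This is not the authors' code: data generation is simplified to mostly-aligned vs. random spins (standing in for Monte Carlo sampling), and the weights are hand-built rather than trained, following the paper's observation that the learned weights end up acting on the magnetisation.

```python
# Toy Ising-phase classifier. Assumptions: "low-T" samples are mostly
# aligned spins with 5% flips, "high-T" samples are fully random spins;
# the hidden-layer weights are hand-built, not learned.
import numpy as np

rng = np.random.default_rng(0)
N = 64  # an 8x8 lattice, flattened

def sample(kind, n):
    if kind == "ordered":
        sign = rng.choice([-1, 1], size=(n, 1))   # ↑ or ↓ phase
        flip = rng.random((n, N)) < 0.05          # small thermal noise
        return np.where(flip, -sign, sign).astype(float)
    return rng.choice([-1.0, 1.0], size=(n, N))   # disordered

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def classify(X):
    # Each hidden unit has uniform weights, so its pre-activation is a
    # linear function of the magnetisation m, as the paper observes.
    m = X.mean(axis=1)
    up = sigmoid(20.0 * (m - 0.5))     # fires for the ↑-polarised phase
    down = sigmoid(20.0 * (-m - 0.5))  # fires for the ↓-polarised phase
    # output neuron: "ordered" if either detector fires
    return sigmoid(10.0 * (up + down - 0.5)) > 0.5

X = np.vstack([sample("ordered", 200), sample("disordered", 200)])
y = np.array([True] * 200 + [False] * 200)
acc = (classify(X) == y).mean()
print(f"accuracy: {acc:.2f}")  # near-perfect on this toy data
```

The same three-unit structure is what the trained network in the paper converges to; here it is simply written down.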

2. Multilayer Feedforward Networks are Universal Approximators

(Hornik, Stinchcombe, and White, 1989)

An early paper, with over 14,000 citations, proving that neural networks can universally approximate any Borel measurable function.

  • Rigorously establishes that feedforward networks with just one hidden layer using arbitrary squashing functions can approximate any Borel measurable function from one finite-dimensional space to another to any degree of accuracy.
  • This holds on the condition that sufficiently many hidden units are available.
  • Try out this cool tool to see this for yourself by making a simple NN in your browser, constructing an arbitrary function with mouse clicks and watching the neural network approximate it - convnetjs
  • While attempting to guess a quadratic, I faced the issue of selecting the optimizer and batch size. Beware of that! This paper - On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima (Keskar et al., Feb 2017) - could be helpful.
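The theorem can be illustrated numerically with a sketch. This is not the paper's construction: to sidestep the optimizer and batch-size issues mentioned above, the hidden weights are drawn at random and only the linear output layer is fitted, by least squares, so that the hidden-unit count is the only thing that varies.

```python
# Sketch of universal approximation: one hidden layer of logistic
# ("squashing") units. Assumption: random hidden weights plus a
# least-squares output layer, rather than full training.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 200)[:, None]
target = np.sin(3 * x) + 0.5 * x**2    # any continuous target would do

def approx_error(n_hidden):
    W = rng.normal(0, 4, (1, n_hidden))
    b = rng.normal(0, 2, n_hidden)
    H = 1.0 / (1.0 + np.exp(-(x @ W + b)))   # hidden activations
    A = np.hstack([H, np.ones_like(x)])      # plus an output bias
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.max(np.abs(A @ coef - target))

errors = {n: approx_error(n) for n in (2, 8, 32)}
for n, e in errors.items():
    print(f"{n:3d} hidden units: max error {e:.4f}")
```

The maximum error shrinks as hidden units are added, matching the "sufficient hidden units" condition of the theorem.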

3. Learning Polynomials Using Neural Networks

(Andoni et al., Microsoft Research, Jun 2014)

  • Learning low-degree polynomials using gradient descent.
  • Using complex-valued weights, “robust local minima” can be eliminated.
  • Can sparse polynomials be learnt with small neural networks?
  • Also check this paper on Numeric law discovery using Neural Networks (Saito and Nakano, NTT Labs, Japan, Aug 1998).
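The simplest version of "learning a low-degree polynomial by gradient descent" looks like the sketch below. Note the assumption: this is not the paper's complex-weight network, just plain gradient descent on a model that is linear in its coefficients, where convergence is easy to see.

```python
# Recover a hidden degree-2 polynomial from samples by gradient descent.
# Assumption: monomial features, so the model is linear in its
# coefficients and the squared-error loss has no bad local minima.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 300)
y = 1.0 - 2.0 * x + 3.0 * x**2          # hidden polynomial (noise-free)

X = np.stack([np.ones_like(x), x, x**2], axis=1)  # monomial features
w = np.zeros(3)
for _ in range(5000):
    grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
    w -= 0.5 * grad

print(np.round(w, 3))                   # recovers the true coefficients
```

The paper's question is precisely what survives of this picture once the monomials must themselves be represented by a neural network.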

4. Prediction of dynamical systems by symbolic regression

(Quade et al., Phys. Rev. E 94, 012214, July 2016)

  • Symbolic regression is a machine-learning technique for model finding. It does not require a specific model to start searching for the structure hidden in data, and it works by combining mathematical building blocks using algorithms such as genetic programming.
  • This paper investigates the so-called fast function extraction, which is a generalized linear regression algorithm, and genetic programming.
  • The methods are illustrated by applying them to a simple harmonic oscillator, to FitzHugh-Nagumo oscillators, which are known to produce complex behaviour, and to short-term and medium-term forecasting of solar power production.
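A miniature of the fast-function-extraction idea can be sketched as follows, with strong simplifying assumptions: the candidate library already contains the oscillator's true frequency, and sparsity is imposed by trying all two-term combinations rather than by the paper's actual algorithm (a real search would also have to discover structure and parameters, e.g. via genetic programming).

```python
# Toy fast function extraction: generalized linear regression over a
# library of building blocks, keeping the sparsest combination that
# fits. Assumption: the true frequency 1.5 is already in the library.
import numpy as np
from itertools import combinations

t = np.linspace(0, 10, 400)
signal = 2.0 * np.cos(1.5 * t)          # harmonic-oscillator trajectory

basis = {
    "1": np.ones_like(t), "t": t, "t^2": t**2,
    "sin(1.5t)": np.sin(1.5 * t), "cos(1.5t)": np.cos(1.5 * t),
    "exp(-t)": np.exp(-t),
}

best = None
for names in combinations(basis, 2):    # sparse: at most two terms
    A = np.stack([basis[n] for n in names], axis=1)
    coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
    err = np.max(np.abs(A @ coef - signal))
    if best is None or err < best[0]:
        best = (err, names, coef)

err, names, coef = best
print(names, np.round(coef, 3))         # picks out the cos(1.5t) term
```

Even this crude version shows the appeal of the method: the recovered model is a readable formula, not a black box.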