
A Fresh New Approach to Numerical Computing

About Me

Roger Luo, Research Assistant, IOP

http://rogerluo.me

Interests: Quantum Information, Machine Learning
Expertise: Hu You

You can find all related material at

http://num.v2nobel.com/talks/9/

I highly recommend you read the documentation here:

http://rogerluo.me/TutorialForPhysicists.jl/

And install this tutorial into your Julia environment with:

julia> Pkg.clone("https://github.com/Roger-luo/TutorialForPhysicists.jl.git")

Benchmark

Tensor Contraction

Python (3.5 on IPython 6.3.1) 326 ms (C implementation in numpy)

import numpy as np

def propagate(LHS, X, Y):
    P = np.einsum('ijk, ipq', LHS, X)
    Q = np.einsum('jkqp, jvrq', P, Y)
    R = np.einsum('kprv, kvm', Q, X)
    return R
LHS = np.random.randn(200, 10, 200)
X = np.random.randn(200, 2, 200)
Y = np.random.randn(10, 2, 10, 2)

%timeit propagate(LHS, X, Y)

Python (3.5 on IPython 6.3.1) 45.1 ms (numpy.tensordot)

def propagate(LHS, X, Y):
    P = np.tensordot(LHS, X, axes=([0, ], [0, ]))
    Q = np.tensordot(P, Y, axes=([0, 2], [0, 1]))
    R = np.tensordot(Q, X, axes=([0, 3], [0, 1]))
    return R
LHS = np.random.randn(200, 10, 200)
X = np.random.randn(200, 2, 200)
Y = np.random.randn(10, 2, 10, 2)

%timeit propagate(LHS, X, Y)

Julia (0.6 with OpenBLAS) 24.593 ms (pure Julia implementation)

using TensorOperations
using BenchmarkTools

function propagate(LHS, X, Y)
    @tensor R[6,7,8] := LHS[1,2,3]*X[1,4,6]*Y[2,5,7,4]*X[3,5,8]
end

BLAS.set_num_threads(1)
LHS = randn(200, 10, 200); X = randn(200, 2, 200); Y = randn(10, 2, 10, 2);

@benchmark propagate(LHS, X, Y)

From the IBM community: link

A Comparison of C, Julia, Numba and Cython on LU Factorization

[Benchmark plot: numba, cython, numpy, scipy, julia, julia-cython]

Comparison to QuTiP

[Benchmark plot comparing against QuTiP]

A Comparison of Differential Equation Solvers

[Benchmark plot comparing differential equation solvers]

Why use a Julia wrapper?

Why use the Julia TensorFlow wrapper?

Some Selected Features

  • Multiple Dispatch
  • Dynamic Type System
  • Good Performance, approaching statically-compiled languages like C
  • Lisp-like macros and other metaprogramming facilities
  • Call Python functions: use the PyCall package
  • Call C functions directly: no wrappers or special APIs
  • Designed for parallelism and distributed computation
  • Automatic generation of efficient, specialized code for different argument types
  • Elegant and extensible conversions and promotions for numeric and other types
  • Efficient support for Unicode, including but not limited to UTF-8
  • MIT licensed: free and open source

Let's learn it from examples

REPL

Julia has an awesome REPL; just type julia in your terminal:

shell> julia
               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.6.3-pre.0 (2017-12-18 07:11 UTC)
 _/ |\__'_|_|_|\__'_|  |  Commit 93168a6 (106 days old release-0.6)
|__/                   |  x86_64-linux-gnu

julia>

CLI Compiler

Like Python and other interpreters, you can run Julia scripts through the command-line interface:

shell> julia script.jl

Demo: Hello World

println("hello world!")
shell> julia hello.jl
hello world!

Builtin Numerical Types

Julia has abundant builtin numerical types, including:

  • Int8 to Int128
  • Float16 to Float64
  • BigInt and BigFloat (arbitrary precision)

Demo: Sum up two numbers

function mysum(a::Int, b)
    return a + b
end
julia> @code_typed mysum(1, 2)
CodeInfo(:(begin 
        return (Base.add_int)(a, b)::Int64
    end))=>Int64
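
The same pattern on an untyped method shows how Julia specializes code per argument type (mysum2 is just an untyped variant defined here for illustration; the printed IR may differ slightly between Julia versions):

julia> mysum2(a, b) = a + b;

julia> @code_typed mysum2(1.0, 2.0)
CodeInfo(:(begin 
        return (Base.add_float)(a, b)::Float64
    end))=>Float64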

Type System

struct Complex128
    real::Float64
    imag::Float64
end
julia> Complex128(1, 2)
Complex128(1.0, 2.0)
struct Complex{T}
    real::T
    imag::T
end
julia> Complex{Float64}(1.0, 2.0)
Complex{Float64}(1.0, 2.0)

But Julia can actually infer the type parameters itself when possible:

julia> Complex(1.0, 2.0)
Complex{Float64}(1.0, 2.0)

This does not make sense for numerics:

Complex{String}("a", "b")

so we constrain the type parameter to be a subtype of Number:

struct Complex{T <: Number} <: Number
    real::T
    imag::T
end

The operator <: means "is a subtype of"; you can also use it to test the subtype relation between types.

julia> Complex <: Number
true
julia> Complex{String}
ERROR: TypeError: Complex: in T, expected T<:Number, got Type{String}

Multiple Dispatch and Julia's Abstractions

Abstract Types

We can define an abstract type hierarchy:

# The zoo society
abstract type Animal end
abstract type AbstractBird <: Animal end
abstract type AbstractCat <: Animal end
# All subtype of Animal should have a name
name(animal::Animal) = animal.name
gender(animal::Animal) = "unknown"

# We can know about each animal if it can fly
# By requiring this function (or property)
# Most animals won't fly
flyable(animal::Animal) = false
# Most birds can
flyable(bird::AbstractBird) = true
# Kitty is a special Cat
struct Kitty <: AbstractCat
    weight::Float64
end

Define a constructor with default values:

# Kitty is 0.3kg when she was born
Kitty() = Kitty(0.3)
# This will error: concrete types like Kitty cannot be subtyped
struct KittyKitty <: Kitty
end

The ability to select a method according to the runtime types of all of a function's arguments is called multiple dispatch.

# comment this method to see what will happen
# Kitty's name
name(me::Kitty) = "Kitty"
# Kitty's gender
gender(me::Kitty) = "female"
# play is a kind of relationship inside the society of animals
# Most animal won't play with others
play(a::Animal, b::Animal) = "$(name(a)) won't play with $(name(b))"
play(a::Animal, b::Animal, c::Animal) = "$(name(a)), $(name(b)), $(name(c)) won't play together"

# Kohen is a bird (this definition is added here so the snippet is self-contained)
struct Kohen <: AbstractBird
end
name(bird::Kohen) = "Kohen"

# Kohen will play with Kitty
# comment this to see what will happen
play(bird::Kohen, cat::Kitty) = "$(name(bird)) will play with $(name(cat))"
play(cat::Kitty, bird::Kohen) = "$(name(cat)) won't play with $(name(bird))"
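
A quick REPL check (assuming all the definitions above, including the Kohen bird added for completeness) shows how the most specific method wins, and how unhandled pairs fall back to the generic one:

julia> play(Kohen(), Kitty())
"Kohen will play with Kitty"

julia> play(Kitty(), Kohen())
"Kitty won't play with Kohen"

julia> play(Kitty(), Kitty())
"Kitty won't play with Kitty"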

Parametric Types

# Bear can have different colors
abstract type AbstractBearColor end
abstract type Green <: AbstractBearColor end
abstract type White <: AbstractBearColor end
abstract type Brown <: AbstractBearColor end
abstract type AbstractBear{T <: AbstractBearColor} <: Animal end
# we could define some general interface to All KINDS OF bears
# and use multiple dispatch to specify methods to certain
# type with certain type parameters (signature)
# the word where declares type parameters
color(::Type{AbstractBear{T}}) where T = "We don't have $T Bear"
# you can specify type parameters' parent type if you want
# Union will create a type union, behaves like an abstract type
# in type inference, but it cannot be subtyped
color(::Type{AbstractBear{T}}) where {T <: Union{Brown, White}} = T
color(bear::AbstractBear) = color(typeof(bear))
# Abram is a Brown bear
struct Abram <: AbstractBear{Brown}
end

name(bear::Abram) = "Abram"

# Dima is a White bear
struct Dima <: AbstractBear{White}
end

name(bear::Dima) = "Dima"

play(a::Abram, b::Dima, c::Kohen) = "Abram, Dima and Kohen often play together"

Get your hands dirty

What is the color of Dima?

julia> color(Dima())

How would you fix this?

julia> Complex(1.0)
ERROR: LoadError: MethodError: Cannot `convert` an object of type Float64 to an object of type Complex
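
One possible fix for the second exercise (a sketch, not necessarily the solution intended in the talk): add an outer constructor that fills in a zero imaginary part.

# possible outer constructor: promote a real number to a complex one
Complex(x::Real) = Complex(x, zero(x))

julia> Complex(1.0)
Complex{Float64}(1.0, 0.0)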

Experiment more with multiple dispatch using the zoo.jl script:

shell> julia zoo.jl

Mutable Types and Immutable Types

Two essential properties of immutability in Julia:

  • An object with an immutable type is passed around (both in assignment statements and in function calls) by copying, whereas a mutable type is passed around by reference.
  • It is not permitted to modify the fields of a composite immutable type.
# name can be changed
mutable struct MutablePerson
    name::String
end

# name cannot be changed
struct Person
    name::String
end

Inner Constructor

struct Complex{T <: Number} <: Number
    real::T
    imag::T

    function Complex(real::T, imag::T) where {T <: Number}
        new{T}(real, imag)
    end
end
julia> struct Even
           e::Int
       end

julia> Even(2)
Even(2)

And to reject odd numbers, one might try an outer constructor:

Even(x) = iseven(x) ? Even(x) : throw(ArgumentError("x=$x is odd"))

This won't work, because it has the signature

Even(x::Any)

But the default constructor has

Even(x::Int)

For an Int argument, Julia dispatches to the more specific default constructor, so our check is never called.

We use an inner constructor instead, which replaces the default constructor:

julia> struct Even
          e::Int
          Even(e::Int) = iseven(e) ? new(e) : throw(ArgumentError("e=$e is odd"))
       end
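
Now construction only succeeds for even numbers (stack trace omitted):

julia> Even(2)
Even(2)

julia> Even(3)
ERROR: ArgumentError: e=3 is odd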

Functional Programming in Julia

Pure Function

Pure functions are functions that do not have side effects; impure functions may change global state.

x = 2
# impure: assigns to the global x
function f()
    global x = 3
end

# pure
f(x) = x
# immutable type by default
struct Foo
    name::String
end

foo = Foo("Me")
foo.name = "You" # this will cause an error
function foo()
    foo()
end

This will cause a stack overflow.

femtolisp, a small Scheme-like Lisp, is functional and is what Julia's parser is written in at the moment.

Pure functions bring several benefits:

  • Laziness
  • Cacheable results (see the sketch after this list)
  • No order dependencies
  • Easy to parallelize
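
Because a pure function always returns the same result for the same arguments, its results can be cached safely. A minimal sketch (the memoize helper below is written here for illustration; it is not a standard library function):

function memoize(f)
    cache = Dict()
    return x -> get!(() -> f(x), cache, x)
end

slow_square(x) = (sleep(1); x^2)    # pretend this is an expensive pure function
fast_square = memoize(slow_square)

fast_square(3)   # takes about a second
fast_square(3)   # returns immediately from the cache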

[Image: deadlock]

Still, some containers are better off mutable:

  • Arrays (Vectors) have amortized O(1) insertion at the end and constant-time lookup, so making them mutable is the better choice
  • Hash maps and hash sets are similar
map(sin, rand(100, 100)) # sin is a pure function

is equivalent to

sin.(rand(100, 100)) # broadcast syntax, similar to MATLAB
Blocks are expressions; they return the value of their last expression. Both of these evaluate to 3:

begin
    3
end

function foo()
    3
end

Metaprogramming

Program Representation

Every Julia program starts life as a string:

julia> prog = "1 + 1"
"1 + 1"

Next it is parsed into an object of the Julia type Expr:

julia> ex1 = parse(prog)
:(1 + 1)

julia> typeof(ex1)
Expr

Expr objects contain two parts:

  • a Symbol identifying the kind of expression. A symbol is an interned string identifier.
julia> ex1.head
:call
  • the expression arguments, which may be symbols, other expressions, or literal values:
julia> ex1.args
3-element Array{Any,1}:
  :+
 1
 1

The key point here is that Julia code is internally represented as a data structure that is accessible from the language itself.
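
Since an expression is ordinary data, we can also construct it programmatically instead of parsing a string; this builds the same expression by hand:

julia> ex2 = Expr(:call, :+, 1, 1)
:(1 + 1)

julia> ex1 == ex2
true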

Quoting

julia> ex = :(a+b*c+1)
:(a + b * c + 1)
ex = quote
    x = 1
    y = 2
    x + y
end

Interpolation

julia> a = 1;

julia> ex = :($a + b)
:(1 + b)

Evaluation

julia> a = 1; b = 2;

julia> eval(:(a + b))
3

Macros

A rough sketch of Julia's execution pipeline (schematic, not runnable code):

prog |> readstring |> parse |> macro expansion |> eval

A macro receives expressions rather than values and runs at expansion time:

macro showexpr(expr)
    println(expr)
end
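
Calling the macro shows that it receives the unevaluated expression: the body runs at expansion time, prints the expression, and returns nothing, so nothing else is evaluated:

julia> @showexpr 1 + 2
1 + 2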

Generated Functions

Calculate a number from its ternary (base-3) representation, given as a tuple of digits with the least significant digit first:

@generated function tri2int(x::NTuple{N, Int}) where N
    # inside a @generated function, x is bound to the argument's type, not its value
    ex = :(x[1])
    for i = 2:N
        base = 3^(i-1)
        ex = :($ex + x[$i] * $base)
    end
    return ex
end
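
For example, reading a tuple as base-3 digits, least significant digit first:

julia> tri2int((1, 2, 0, 1))   # 1 + 2*3 + 0*9 + 1*27
34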

Why Julia rather than Python, Numba, or PyPy?

Julia's performance is not magic; the language design is.

In Python, instances can gain new attributes at runtime, so the compiler cannot rely on a fixed memory layout:

class Foo(object):

    STATIC_MEMBER = 0

    def method1(self):
        pass

    def method2(self):
        pass
In [1]: foo = Foo()

In [2]: foo.a = 2

In [3]: foo.a
Out[3]: 2
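
In Julia, by contrast, the fields of a struct are fixed when the type is defined, so the compiler knows the exact memory layout. A minimal sketch of the same experiment (the Bar type is just for illustration):

mutable struct Bar
    x::Float64
end

bar = Bar(1.0)
bar.a = 2   # ERROR: type Bar has no field a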

Julia functions can even be made callable from C (making Julia behave more like an AOT-compiled language than Python):

Base.@ccallable function hello(n::Int)::Int
    for i = 1:n
        println("hello world")
    end
    return 0
end

Julia's metaprogramming is much stronger than that of most popular languages in the numerical computing community, e.g. Python and C/C++.

ITensor tensor contraction (C++):

Index a("a",2), 
      b("b",2), 
      c("c",2);
ITensor Z(a,b), 
        X(c,b);

Z.set(a(1),b(1),+1.0);
Z.set(a(2),b(2),-1.0);

X.set(b(1),c(2),+1.0);
X.set(b(2),c(1),+1.0);

//the * operator finds and
//contracts common index 'b'
//regardless of index order:

ITensor R = Z * X;

Print( R.real(a(2),c(1)) ); 
//output: R.real(a(2),c(1)) = -1

taco tensor contraction (C++):

// Create formats
Format csr({Dense,Sparse});
Format csf({Sparse,Sparse,Sparse});
Format  sv({Sparse});

// Create tensors
Tensor<double> A({2,3},   csr);
Tensor<double> B({2,3,4}, csf);
Tensor<double> c({4},     sv);

// Insert data into B and c
B.insert({0,0,0}, 1.0);
B.insert({1,2,0}, 2.0);
B.insert({1,3,1}, 3.0);
c.insert({0}, 4.0);
c.insert({1}, 5.0);

// Pack inserted data as described by the formats
B.pack();
c.pack();

// Form a tensor-vector multiplication expression
IndexVar i, j, k;
A(i,j) = B(i,j,k) * c(k);

// Compile the expression
A.compile();

// Assemble A's indices and numerically compute the result
A.assemble();
A.compute();

A brief look at tensor contraction with TensorOperations.jl:

using TensorOperations

alpha = randn()
A = randn(5,5,5,5,5,5)
B = randn(5,5,5)
C = randn(5,5,5)
D = zeros(5,5,5)

@tensor begin
    D[a,b,c] = A[a,e,f,c,f,g]*B[g,b,e] + alpha*C[c,a,b]
    E[a,b,c] := A[a,e,f,c,f,g]*B[g,b,e] + alpha*C[c,a,b]
end

Let's write a Kronecker Expression Parser
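
As a taste of what such a parser could look like, here is a minimal sketch (the @kron macro and parse_kron helper are hypothetical names written for this example, not part of any package); it walks the expression tree and rewrites every ⊗ call into a call to Base's kron:

# fold `a ⊗ b ⊗ c` into nested `kron` calls
function parse_kron(ex)
    if ex isa Expr && ex.head == :call && ex.args[1] == :⊗
        return Expr(:call, :kron, map(parse_kron, ex.args[2:end])...)
    end
    return ex
end

macro kron(ex)
    esc(parse_kron(ex))
end

# usage: with σx and σz defined as 2×2 matrices,
# @kron σx ⊗ σz expands to kron(σx, σz)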

Julia is the only dynamic language that can do GPGPU natively.

Effective Extensible Programming: Unleashing Julia on GPUs: arXiv:1712.03112

Define a CuArray just like a CPU array:

xs = cu(rand(5, 5))
ys = cu[1, 2, 3]
xs_cpu = collect(xs)

Broadcasting Operations

Exactly the same as on the CPU:

zs .= xs.^2 .+ ys .* 2

Write Your Own CUDA kernel

using CuArrays, CUDAnative

xs, ys, zs = CuArray(rand(1024)), CuArray(rand(1024)), CuArray(zeros(1024))

function kernel_vadd(out, a, b)
  i = (blockIdx().x-1) * blockDim().x + threadIdx().x
  out[i] = a[i] + b[i]
  return
end

@cuda (1, length(xs)) kernel_vadd(zs, xs, ys)

@assert zs == xs + ys

More details here: Generic GPU Kernels

Performance Tips

The more you tell the compiler, the faster your program could be

Use BenchmarkTools to measure your performance (similar to Python's timeit):

julia> Pkg.add("BenchmarkTools")

julia> @benchmark rand(1000, 1000)
  • Avoid global variables

  • Provide as much information as possible at compile time.

e.g. StaticArrays is faster than the native Array for small fixed-size arrays.

  • Avoid containers with abstract type parameters
a = Real[] # should use Float64[] here
if (f = rand()) < .8
    push!(a, f)
end

It is hard for the compiler to know the exact type information when a field is declared with an abstract type:

struct MyAmbiguousType
    a::AbstractFloat
end

This is better:

struct MyAmbiguousType{T <: AbstractFloat}
    a::T
end

Annotate values taken from untyped locations

function foo(a::Array{Any,1})
    x = a[1]::Int32
    b = x+1
    ...
end

For more performance tips, please refer to the official documentation.

What are we working on?

The Quantum Many-body Toolkit

The name is not decided yet; better suggestions are welcome.

Motivation

  • Tiny numerical tools like the Kronecker expression parser can be too fragmented to each become a single package, but we can collect them together behind a unified interface.
  • Some Python packages are not efficient enough, yet developing a C++ backend for them takes a long time.
  • Some C/C++ libraries are efficient but lack a user-friendly interface that lets researchers focus on their research.
  • Most physicists do not have professional experience writing high-performance C/C++ programs, but in quantum physics, especially quantum many-body physics, both program efficiency and development efficiency are crucial.

What Julia offers us:

Enough efficiency: Julia can approach C/C++ implementations in speed in simple situations, and sometimes do better.

You can achieve high abstraction with low overhead through metaprogramming and code generation in Julia.

A more user-friendly interface can be implemented through Julia's metaprogramming.

Awesome Foreign Function Interface (FFI)

Call the function clock from libc.so

t = ccall((:clock, "libc"), Int32, ())

Create a C-callable function pointer from a Julia function:

function mycompare(a::T, b::T) where T
    return convert(Cint, a < b ? -1 : a > b ? +1 : 0)::Cint
end
cfunction(mycompare, Cint, (Ref{Cdouble}, Ref{Cdouble}))
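
And pass it to C's qsort, roughly following the example from the manual (a sketch for Julia 0.6; on later versions Void is spelled Cvoid):

A = [1.3, -2.7, 4.4, 3.1]
const mycompare_c = cfunction(mycompare, Cint, (Ref{Cdouble}, Ref{Cdouble}))
ccall(:qsort, Void, (Ptr{Cdouble}, Csize_t, Csize_t, Ptr{Void}),
      A, length(A), sizeof(eltype(A)), mycompare_c)
# A is now sorted in place: [-2.7, 1.3, 3.1, 4.4]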

Julia cares about science. It is approaching 1.0, with only about 10 issues left on the 1.0 milestone.

Star our package and find its documentation at rogerluo.me/QMTK.jl