Julia (programming language)

From Wikipedia, the free encyclopedia
Julia
Official Julia logo
Paradigm Multi-paradigm: multiple dispatch ("object-oriented"), procedural, functional, meta, multistaged[1]
Designed by Jeff Bezanson, Stefan Karpinski, Viral B. Shah, Alan Edelman
Developer Jeff Bezanson, Stefan Karpinski, Viral B. Shah, and other contributors[2][3]
First appeared 2012[4]
Stable release
0.6.0[5] / 19 June 2017[6]
0.5.2 / 8 May 2017[7][8]
Preview release
0.7.0-DEV / daily updates
Typing discipline Dynamic, nominative, parametric
Implementation language Julia, C, Scheme (the parser; using the FemtoLisp implementation), assembly and dependencies (i.e. LLVM) in C++; standard library: Julia (mostly), C (a few dependencies), Fortran (for BLAS)[9]
Platform IA-32, x86-64
OS Linux, macOS, Windows and community support for FreeBSD
License MIT (core),[2] GPL v2;[9][10] a make-file option omits GPL libraries[11]
Filename extensions .jl
Website JuliaLang.org

Julia is a high-level dynamic programming language designed to address the needs of high-performance numerical analysis and computational science, without the typical need for separate compilation to achieve speed, while also being effective for general-purpose programming,[15][16][17][18] web use[19][20] or as a specification language.[21]

Distinctive aspects of Julia's design include a type system with parametric polymorphism in a fully dynamic programming language, and multiple dispatch as its core programming paradigm. It allows concurrent, parallel and distributed computing, and direct calling of C and Fortran libraries without glue code.

Julia is garbage-collected,[22] uses eager evaluation and includes efficient libraries for floating-point calculations, linear algebra, random number generation, fast Fourier transforms and regular expression matching.

History[edit]

Work on Julia was started in 2009 by Jeff Bezanson, Stefan Karpinski, Viral B. Shah, and Alan Edelman, who set out to create a language that was both high-level and fast. On Valentine's Day 2012 the team launched[23] a website with a blog post explaining the language's mission. Since then, the Julia community has grown, with an estimated 250,000 users as of 2017.[citation needed] It has attracted some high-profile users, from investment manager BlackRock, which uses it for time-series analytics, to the British insurer Aviva, which uses it for risk calculations. In 2015, the Federal Reserve Bank of New York used Julia to make models of the US economy, noting that the language made model estimation "about 10 times faster" than its previous MATLAB implementation. Julia's co-founders established Julia Computing in 2015 to provide paid support, training, and consulting services to clients, though Julia itself remains free to use. At the 2017 JuliaCon[24] conference, Jeff Regier, Keno Fischer and others announced[25] that the Celeste project[26] used Julia to achieve over 1 petaflop of compute power when running on the Cori supercomputer (the 5th fastest in the world at the time; 6th fastest as of June 2017). Julia thus joins C, C++, and Fortran as high-level languages in which petaflop computations have been written.

Language features[edit]

According to the official website, the main features of the language are:

  • Multiple dispatch: the ability to define function behavior across many combinations of argument types
  • Dynamic type system: types for documentation, optimization, and dispatch
  • Good performance, approaching that of statically typed languages like C
  • A built-in package manager
  • Lisp-like macros and other metaprogramming facilities
  • Call Python functions: use the PyCall package[a]
  • Call C functions directly: no wrappers or special APIs
  • Powerful shell-like abilities to manage other processes
  • Designed for parallel and distributed computing
  • Coroutines: lightweight green threading
  • User-defined types are as fast and compact as built-ins
  • Automatic generation of efficient, specialized code for different argument types
  • Elegant and extensible conversions and promotions for numeric and other types
  • Efficient support for Unicode, including but not limited to UTF-8
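
The first bullet above, multiple dispatch, can be sketched in a few lines. This is an illustrative example with hypothetical function names, not code from the Julia standard library: one generic function gets several methods, and the method chosen depends on the runtime types of all arguments, not just the first.

```julia
# One generic function, several methods specialized on argument types.
describe(x::Number, y::Number) = "two numbers"
describe(x::AbstractString, y::Number) = "a string and a number"
describe(x, y) = "something else"          # fallback: both arguments are Any

println(describe(1, 2.5))     # selects the (Number, Number) method
println(describe("hi", 3))    # selects the (AbstractString, Number) method
println(describe([1], :sym))  # no specialized method matches; fallback runs
```

Julia compiles a specialized native-code version of each method for each concrete combination of argument types it actually sees, which is how user-defined functions can match built-in performance.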

Multiple dispatch (also termed multimethods in Lisp) is a generalization of single dispatch, the polymorphic mechanism used in common object-oriented programming (OOP) languages, which dispatches on a single receiver via inheritance. In Julia, all types are, directly or indirectly, subtypes of the Any type, the top of the type hierarchy, and concrete types are subtypes of abstract types. Concrete types cannot themselves be subtyped: composition is used instead of the implementation inheritance found in traditional object-oriented languages (see also inheritance vs subtyping).
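
A minimal sketch of that hierarchy, using current Julia syntax and hypothetical type names: abstract types form the internal nodes, concrete types are the leaves, and composition replaces subtyping of concrete types.

```julia
# Abstract types can be subtyped but hold no data themselves.
abstract type Shape end

# A concrete type: carries data, cannot itself be subtyped.
struct Circle <: Shape
    r::Float64
end

# Composition instead of inheritance: embed a Circle rather than subtype it.
struct Disk
    boundary::Circle
    filled::Bool
end

println(Circle <: Shape)   # true: declared subtype relation
println(Shape <: Any)      # true: Any is the top of the hierarchy
```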

Julia draws significant inspiration from various dialects of Lisp, including Scheme and Common Lisp. It shares many features with Dylan – also a multiple-dispatch-oriented dynamic language, with an ALGOL-like free-form infix syntax rather than a Lisp-like prefix syntax (though in Julia "everything"[31] is an expression) – and with Fortress, another numerical programming language with multiple dispatch and a sophisticated parametric type system. While the Common Lisp Object System (CLOS) adds multiple dispatch to Common Lisp, not all functions there are generic functions.

In Julia, Dylan and Fortress, extensibility is the default, and the system's built-in functions are all generic and extensible. In Dylan, multiple dispatch is as fundamental as it is in Julia: all user-defined functions and even basic built-in operations like + are generic. Dylan's type system, however, does not fully support parametric types, which are more typical of the ML lineage of languages. By default, CLOS does not allow dispatch on Common Lisp's parametric types; such extended dispatch semantics can only be added as an extension through the CLOS metaobject protocol. By convergent design, Fortress also features multiple dispatch on parametric types; unlike Julia, however, Fortress is statically rather than dynamically typed, with separate compilation and execution phases. These language features are summarized in the following table:

Language     Type system  Generic functions  Parametric types
Julia        Dynamic      Default            Yes
Common Lisp  Dynamic      Opt-in             Yes (but no dispatch)
Dylan        Dynamic      Default            Partial (no dispatch)
Fortress     Static       Default            Yes

By default, the Julia runtime must be pre-installed to run user-provided source code. Alternatively, a standalone executable that needs no separate Julia installation can be built with BuildExecutable.jl.[32][33]

Julia's syntactic macros (used for metaprogramming), like Lisp macros, are more powerful than the text-substitution macros used in the preprocessors of some other languages such as C, because they work at the level of abstract syntax trees (ASTs). Julia's macro system is hygienic, but also supports deliberate capture when desired (as for anaphoric macros) using the esc construct.
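
A small sketch of both points (the macro name is made up for illustration): the macro receives an expression, not text, and returns a new expression; esc deliberately escapes hygiene so the expression runs in the caller's scope.

```julia
# A syntactic macro: takes an expression `ex` and splices it in twice.
macro twice(ex)
    return quote
        $(esc(ex))   # esc makes `ex` refer to the caller's variables
        $(esc(ex))
    end
end

x = 0
@twice x += 1   # expands to two copies of `x += 1`
println(x)      # 2
```

Without esc, hygiene would rename x inside the expansion to a fresh variable, and the caller's x would be untouched; that is exactly the accidental capture hygiene protects against.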

Interaction[edit]

The official Julia distribution includes an interactive session shell, Julia's read–eval–print loop (REPL), which can be used to experiment with and test code quickly.[34] The following fragment shows a sample session in which strings are concatenated automatically by println:[35]

julia> p(x) = 2x^2 + 1; f(x, y) = 1 + 2p(x)y
julia> println("Hello world!", " I'm on cloud ", f(0, 4), " as Julia supports recognizable syntax!")
Hello world! I'm on cloud 9 as Julia supports recognizable syntax!

The REPL gives the user access to the system shell and to a help mode, by pressing ; or ? after the prompt (preceding each command), respectively. It also keeps a history of commands, including between sessions.[36] Code can be tested inside Julia's interactive session or saved to a file with a .jl extension and run from the command line by typing:[31]

$ julia <filename>

Julia is supported by Jupyter, an online interactive "notebooks" environment.[37]

Use with other languages[edit]

Julia's ccall keyword is used to call C-exported or Fortran shared library functions individually.
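
As a minimal sketch of ccall, calling a function from the C standard library (assuming a system where strlen is available in the running process, as on typical Linux and macOS installs):

```julia
# ccall(function, return type, argument-type tuple, arguments...)
# Calls C's strlen directly; no wrapper code or special API is needed.
len = ccall(:strlen, Csize_t, (Cstring,), "Julia")
println(len)   # 5
```

The argument-type tuple tells Julia how to convert its values to C types at the call boundary; the call itself compiles down to a plain native function call with no added overhead.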

Julia has Unicode 9.0 support, with UTF-8 used for source code (and by default for strings), and optionally allows common math symbols for many operators, e.g. ∈ for the in operator.
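
For example, the Unicode and ASCII spellings of these operators are interchangeable, and identifiers themselves may be Unicode (symbols are typically entered in the REPL via LaTeX-style tab completion, e.g. \in<tab>):

```julia
# ∈ is the same operator as `in`; ≤ is the same operator as <=.
println(2 ∈ [1, 2, 3])         # true
println((1 ≤ 2) == (1 <= 2))   # true

# Unicode identifiers, with numeric-literal coefficient syntax:
α = 0.25
println(2α)                    # 0.5
```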

Julia has packages supporting markup and data formats such as HTML, XML, JSON and BSON (and also the HTTP protocol), and packages for databases and web use in general.

Implementation[edit]

Julia's core is implemented in Julia and C (with the LLVM dependency in C++), together with assembly; its parser is written in Scheme ("femtolisp"). The LLVM compiler infrastructure project is used as the back end for generating 64-bit or 32-bit optimized machine code (i.e., Julia does not compile to a VM[38]), depending on the platform Julia runs on. With some exceptions (e.g., PCRE), the standard library is implemented in Julia itself. The most notable aspect of Julia's implementation is its speed, which is often within a factor of two of fully optimized C code (and thus often an order of magnitude faster than Python or R).[39] Development of Julia began in 2009 and an open-source version was publicized in February 2012.[4][40]

Julia 0.6 "is now considered the stable line of releases and is recommended for most users, as it provides both language and API stability"[41] and is on a monthly release schedule in which bugs are fixed and some new features from 0.7-DEV are backported. In contrast, the 0.5 release line receives only critical bug fixes, and older lines are no longer maintained.

Current and future platforms[edit]

While Julia uses JIT compilation[42] (MCJIT[43] from LLVM), it generates native machine code directly, before a function is first run – not bytecode that is run on a virtual machine (VM) or translated as it runs, as with, e.g., Java on the JVM, or Dalvik on Android.

Current support is for 32- and 64-bit x86 processors (all except ancient pre-Pentium 4-era models, to optimize for newer ones), with executables and source code also available for other architectures. Support for ARM, AArch64, and POWER8 (little-endian) is available too.[44] "Nightly builds are available for ARMv7-A. [..] Note that OpenBLAS only supports ARMv7. For older ARM variants, using the reference BLAS may be the simplest thing to do. [..] Note: These [Raspberry Pi] chips use ARMv6, which is not well supported at the moment. However it is possible to get a working Julia build. [e.g. supported] nVidia Jetson TX2 [with] CUDA functionality"[45] Raspberry Pi support includes limited support for the ARMv6-based Raspberry Pi 1,[46][47] and is better for newer ARMv7-based Pis; Julia is now available in Raspbian,[48] and its Raspberry Pi support is promoted by the Raspberry Pi Foundation.[49]

Julia supports 64-bit ARM and PowerPC: it "fully supports ARMv8 (AArch64) processors, and supports ARMv7 and ARMv6 (AArch32) with some caveats".[50][51] PowerPC support is still being worked on, with almost no open platform-specific issues,[52][53] and binaries are available for POWER7 ("due to some small support from IBM") and POWER8, which were expected to have official beta support as of version 0.5 (at least for non-parallel use).[54]

Support for GNU Hurd is being worked on (in JuliaLang's openlibm dependency project).[55]

Julia version 1.0 is planned for 2017: the current 0.7.0-DEV version will, after release, be renamed to 1.0 (with only tiny changes, due to deprecations). Some features, e.g. "multiple inheritance for abstract types", are discussed for a version 2+ that is also planned.[56]

Julia2C source-to-source compiler[edit]

A Julia2C source-to-source compiler from Intel Labs is available.[57] It is a fork of Julia that emits C code instead of native machine code, for functions or whole programs, so that the full Julia implementation is not needed to run the generated code. This makes Julia effectively much more portable, as C is very portable, with compilers available for most CPUs. The compiler is also meant to allow analyzing code at a higher level than C.[58]

Intel's ParallelAccelerator.jl[59] can be thought of as a partial Julia-to-C++ compiler (and then, transparently, to machine code), but its objective is parallel speedup – which can be "100x over plain Julia" for the older 0.4 version,[60] and in some cases also sped up serial code manyfold for that version – not compiling the full Julia language to C++. C++ is only an implementation detail, and later versions might not compile to it; the package does not need to handle all of Julia's syntax, as the rest is handled by Julia itself.

Julia Computing company[edit]

Jeff Bezanson founded the consulting company Julia Computing with his fellow Julia creators: his doctoral advisor Alan Edelman, Stefan Karpinski and Viral B. Shah.

See also[edit]

Notes[edit]

  1. ^ Calling newer Python 3 also works[27][28] (as does PyPy[29]), and calling in the other direction, from Python to Julia, is supported with pyjulia.[30] Even calling recursively (back and forth) between these languages is possible, with or without Polyglot.jl,[29] which supports additional languages besides Python.

References[edit]

  1. ^ "Smoothing data with Julia’s @generated functions". 5 November 2015. Retrieved 9 December 2015. Julia’s generated functions are closely related to the multistaged programming (MSP) paradigm popularized by Taha and Sheard, which generalizes the compile time/run time stages of program execution by allowing for multiple stages of delayed code execution. 
  2. ^ a b "LICENSE.md". GitHub. 
  3. ^ "Contributors to JuliaLang/julia". GitHub. 
  4. ^ a b c d e f "Why We Created Julia". Julia website. February 2012. Retrieved 7 February 2013. 
  5. ^ "Julia Downloads". JuliaLang.org. Retrieved 2017-06-20. 
  6. ^ "v0.6.0". Github.com. 2017-06-19. Retrieved 2017-06-20. 
  7. ^ "v0.5.2". Github.com. 2017-05-08. 
  8. ^ https://github.com/JuliaLang/julia/commit/f4c6c9d4bbbd9587d84494e314f692c15ff1f9c0
  9. ^ a b "Julia". Julia. NumFocus project. Retrieved 9 December 2016. Julia’s Base library, largely written in Julia itself, also integrates mature, best-of-breed open source C and Fortran libraries for ... 
  10. ^ "Non-GPL Julia?". Groups.google.com. Retrieved 2017-05-31. 
  11. ^ "Introduce USE_GPL_LIBS Makefile flag to build Julia without GPL libraries". Note that this commit does not remove GPL utilities such as git and busybox that are included in the Julia binary installers on Mac and Windows. It allows building from source with no GPL library dependencies. 
  12. ^ a b "Introduction". The Julia Manual. Read the Docs. Retrieved 6 December 2016. 
  13. ^ "Programming Language Network". GitHub. Retrieved 6 December 2016. 
  14. ^ "JuliaCon 2016". JuliaCon. Retrieved 6 December 2016. He has co-designed the programming language Scheme, which has greatly influenced the design of Julia 
  15. ^ "The Julia Language" (official website). 
  16. ^ Bryant, Avi (15 October 2012). "Matlab, R, and Julia: Languages for data analysis". O'Reilly Strata. 
  17. ^ Krill, Paul (18 April 2012). "New Julia language seeks to be the C for scientists". InfoWorld. 
  18. ^ Finley, Klint (3 February 2014). "Out in the Open: Man Creates One Programming Language to Rule Them All". Wired. 
  19. ^ "Escher : With Escher you can build beautiful Web UIs entirely in Julia". Shasi.github.io. Retrieved 2017-05-31. 
  20. ^ "Getting Started with Node Julia · Node Julia". Node-julia.readme.io. Retrieved 2017-05-31. 
  21. ^ Moss, Robert (26 June 2015). "Using Julia as a Specification Language for the Next-Generation Airborne Collision Avoidance System". Archived from the original on 1 July 2015. Retrieved 29 June 2015. Airborne collision avoidance system 
  22. ^ "Suspending Garbage Collection for Performance...good idea or bad idea?". Groups.google.com. Retrieved 2017-05-31. 
  23. ^ Jeff Bezanson, Stefan Karpinski, Viral Shah, Alan Edelman. "Why We Created Julia". JuliaLang.org. Retrieved 5 June 2017. 
  24. ^ "JuliaCon 2017". juliacon.org. Retrieved 2017-06-04. 
  25. ^ Fisher, Keno. "The Celeste Project". juliacon.org. Retrieved 24 June 2017. 
  26. ^ Regier, Jeffrey; Pamnany, Kiran; Giordano, Ryan; Thomas, Rollin; Schlegel, David; McAulife, Jon; Prabat. "Learning an Astronomical Catalog of the Visible Universe through Scalable Bayesian Inference". arxiv.org. Retrieved 24 June 2017. 
  27. ^ "PyCall.jl". stevengj. github.com. 
  28. ^ "Using PyCall in julia on Ubuntu with python3". julia-users at Google Groups. to import modules (e.g. python3-numpy) 
  29. ^ a b "Polyglot.jl". wavexx. github.com. 
  30. ^ "python interface to julia". 
  31. ^ a b "Learn Julia in Y Minutes". Learnxinyminutes.com. Retrieved 2017-05-31. 
  32. ^ "Build a standalone executables from a Julia script". 
  33. ^ ".jl to .exe". Groups.google.com. Retrieved 2017-05-31. 
  34. ^ Getting Started
  35. ^ See also: http://julia.readthedocs.org/en/stable/manual/strings/ for string interpolation and the string(greet, ", ", whom, ".\n") example for preferred ways to concatenate strings. Julia has the println and print functions, but also a @printf macro (i.e. not in function form) to eliminate run-time overhead of formatting (unlike the same function in C).
  36. ^ "Julia Documentation". JuliaLang.org. Retrieved 18 November 2014. 
  37. ^ "Project Jupyter". 
  38. ^ "Chris Lattner discusses the name LLVM". Retrieved 22 December 2011. 
  39. ^ "Julia: A Fast Dynamic Language for Technical Computing" (PDF). 2012. 
  40. ^ Gibbs, Mark (9 January 2013). "Pure and Julia are cool languages worth checking out". Network World (column). Retrieved 7 February 2013. 
  41. ^ https://julialang.org/blog/2017/06/julia-0.6-release
  42. ^ "Support MCJIT". Github.com. Retrieved 26 May 2015. 
  43. ^ "Using MCJIT with the Kaleidoscope Tutorial". 22 July 2013. Retrieved 26 May 2015. The older implementation (llvm::JIT) is a sort of ad hoc implementation that brings together various pieces of the LLVM code generation and adds its own glue to get dynamically generated code into memory one function at a time. The newer implementation (llvm::MCJIT) is heavily based on the core MC library and emits complete object files into memory then prepares them for execution. 
  44. ^ "julia/README.md at v0.5.2 · JuliaLang/julia · GitHub". Github.com. 2017-05-03. Retrieved 2017-05-31. 
  45. ^ JuliaLang. "julia/README.arm.md at v0.5.2 · JuliaLang/julia · GitHub". Github.com. Retrieved 2017-05-31. 
  46. ^ "Cross-compiling for ARMv6". Retrieved 16 May 2015. I believe #10917 should fix this. The CPU used there arm1176jzf-s. 
  47. ^ "ARM build failing during bootstrap on Raspberry Pi 2". Retrieved 16 May 2015. I can confirm (FINALLY) that it works on the Raspberry Pi 2 [..] I guess we can announce alpha support for arm in 0.4 as well. 
  48. ^ "Julia available in Raspbian on the Raspberry Pi". recommend using the Pi 3. 
  49. ^ "Julia language for Raspberry Pi". Raspberry Pi Foundation. 
  50. ^ https://github.com/JuliaLang/julia/blob/master/README.arm.md
  51. ^ https://github.com/JuliaLang/julia/issues/10791#issuecomment-91735439
  52. ^ https://github.com/JuliaLang/julia/labels/Power
  53. ^ "Porting Julia to PowerPC". Retrieved 9 May 2015. Wow, the latest git allows me to build to completion. 
  54. ^ "IBM Power port". I am hoping we can have beta support from the 0.5 release onwards for sequential julia. We were able to do this work due to some small support from IBM. 
  55. ^ "Fix building tests on GNU/kFreeBSD and GNU/Hurd by ginggs · Pull Request #129 · JuliaLang/openlibm". Github.com. Retrieved 2017-05-31. 
  56. ^ "Interfaces for Abstract Types Issue #6975". Github.com. Retrieved 2017-05-31. 
  57. ^ "julia/j2c at j2c · IntelLabs/julia". Github.com. Retrieved 2017-05-31. 
  58. ^ "Julia2C initial release". By translating Julia to C, we leverage the high-level abstractions (matrix, vector, ..), which are easier to analyze, and can potentially add the rich extensions of C (like openmp, tbb, ...).

    The tool may also extend Julia to new architectures where the only available tool chain is for C
    [..]
    Translation from C to Julia might be harder.
     
  59. ^ "The ParallelAccelerator package, part of the High Performance Scripting project at Intel Labs". Intel Labs. 
  60. ^ Lindsey Kuper (2016-03-01). "An introduction to ParallelAccelerator.jl". JuliaLang.org. Retrieved 2017-05-31. 
  61. ^ https://juliacomputing.com/press/2017/06/19/funding.html

External links[edit]