Contents
Proceedings of SciPy 2009
SciPy 2009, the 8th annual Scientific Computing with Python conference, was held August 18–23, 2009 in Pasadena, California. Fourteen peer-reviewed articles were published in the conference proceedings. The full proceedings and organizing committee can be found at https://
Progress Report: NumPy and SciPy Documentation in 2009
In the spring of 2008, the SciPy Documentation Project began writing documentation for NumPy and SciPy. Starting from 8,658 words, the NumPy reference pages have grown to over 110,000 words, producing an 884-page PDF document.
Joseph Harrington, David Goldsmith
Neutron-scattering data acquisition and experiment automation with Python
PyDas is a set of Python modules used to integrate various components of the Data Acquisition System at the Spallation Neutron Source (SNS). PyDas enables customized automation of neutron scattering experiments in a rapid and flexible manner. It provides wxPython-based GUIs for routine experiments as well as an IPython-based command-line scripting environment.
Piotr A. Zolnierczuk, Richard E. Riedel
Multiprocess System for Virtual Instruments in Python
Programs written for controlling laboratory equipment and interfacing numerical calculations share the need for a simple graphical user interface (GUI) frontend and a multithreaded or multiprocess structure that allows control and data display to remain usable while other actions are performed. We introduce Pythics, a system for running "virtual instruments", which are simple programs typically used for data acquisition and analysis. Pythics provides a simple means of creating a virtual instrument and customizing its appearance and functionality without the need for toolkit-specific knowledge. It utilizes a robust, multiprocess structure that separates the GUI from the back end of each instrument, allowing effective use of system resources without sacrificing functionality.
Brian D'Urso
Nitime: time-series analysis for neuroimaging data
Nitime is a library for the analysis of time-series developed as part of the Nipy project, an effort to build open-source libraries for neuroimaging research. We briefly describe functional neuroimaging and some of the unique considerations applicable to time-series analysis of data acquired using these techniques, and provide examples of using nitime to analyze both synthetic data and real-world neuroimaging time-series.
Ariel Rokem, Michael Trumpis, Fernando Pérez
Exploring the future of bioinformatics data sharing and mining with Pygr and Worldbase
Worldbase is a virtual namespace for scientific data sharing, accessed via `from pygr import worldbase`. Worldbase enables users to access, save, and share complex datasets as easily as referring to a commonly used dataset by name.
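The "virtual namespace" idea the abstract describes — dotted names that resolve to datasets on access — can be sketched in a few lines of plain Python. This is a conceptual stand-in, not the pygr worldbase API; the registry contents and the `Namespace` class are invented for illustration.

```python
# Conceptual sketch of a virtual namespace for named datasets.
# Attribute access walks a dotted path; when the path matches a
# registered name, the dataset is loaded lazily via its loader.
class Namespace:
    def __init__(self, registry, prefix=""):
        self._registry = registry
        self._prefix = prefix

    def __getattr__(self, name):
        path = f"{self._prefix}.{name}" if self._prefix else name
        if path in self._registry:
            return self._registry[path]()   # load the dataset on demand
        return Namespace(self._registry, path)

# A registry mapping well-known names to dataset loaders
# (the name and payload here are purely illustrative).
registry = {"Bio.Seq.Genome.HUMAN.hg17": lambda: "human genome assembly hg17"}
worldbase = Namespace(registry)

dataset = worldbase.Bio.Seq.Genome.HUMAN.hg17
print(dataset)  # human genome assembly hg17
```

The point of this design is that consumers of a dataset never deal with file paths or download logic; they name what they want and the namespace decides how to fetch it.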
Christopher Lee, Alexander Alekseyenko, C. Titus Brown
The FEMhub Project and Classroom Teaching of Numerical Methods
We introduce briefly the open source project FEMhub and focus on describing how it can be used for live demonstrations of elementary numerical methods in daily classroom teaching.
Pavel Solin, Ondrej Certik, Sameer Regmi
Sherpa: 1D/2D modeling and fitting in Python
Sherpa is a modern, general purpose fitting and modeling application available in Python. It contains a set of robust optimization methods that are critical to the forward fitting technique used in parametric data modeling. The Python implementation provides a powerful software package that is flexible and extensible with direct access to all internal data objects.
Brian L. Refsdal, Stephen M. Doe, Dan T. Nguyen, +11
PMI - Parallel Method Invocation
The Python module `pmi` (Parallel Method Invocation) is presented. It allows users to write simple, non-parallel Python scripts that use functions and classes that are executed in parallel.
The module is well suited to be employed by other modules and packages that want to provide functions that are executed in parallel. The user of such a module does not have to write a parallel script, but can still profit from parallel execution.
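The execution model described above — a serial-looking script whose function calls are fanned out to parallel workers — can be sketched with the standard library. This is a conceptual stand-in, not the `pmi` API (`invoke` and `partial_sum` are invented names), and it uses threads where `pmi` targets genuinely parallel workers.

```python
# Conceptual sketch: the caller writes one ordinary function call,
# but the work is striped across several parallel workers.
from concurrent.futures import ThreadPoolExecutor

def invoke(func, n_workers, *args):
    """Run func(rank, n_workers, *args) on every worker; gather results."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        futures = [pool.submit(func, rank, n_workers, *args)
                   for rank in range(n_workers)]
        return [f.result() for f in futures]

def partial_sum(rank, n_workers, data):
    # Each worker sums its own stripe of the data.
    return sum(data[rank::n_workers])

data = list(range(101))
# From the script's point of view this is a single, serial call.
total = sum(invoke(partial_sum, 4, data))
print(total)  # 5050
```

The benefit for library authors is the one the abstract names: a module can expose plain functions to its users while executing them in parallel internally.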
Olaf Lenz
PaPy: Parallel and distributed data-processing pipelines in Python
PaPy, which stands for parallel pipelines in Python, is a highly flexible framework that enables the construction of robust, scalable workflows for either generating or processing voluminous datasets. The simplicity and flexibility of distributed workflows using PaPy bridge the gap between desktop and grid computing, enabling this new computing paradigm to be leveraged in the processing of large scientific datasets.
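The pipeline structure such frameworks generalize can be sketched minimally in plain Python: a chain of worker functions applied lazily to a stream of inputs. This is a conceptual illustration, not PaPy's API; the `pipeline` helper and the stage functions are invented, and PaPy's contribution is running such stages on parallel and distributed workers.

```python
# Conceptual sketch: a pipeline is a chain of stages applied
# lazily to a stream of inputs, one item at a time.
def pipeline(stages, inputs):
    stream = iter(inputs)
    for stage in stages:
        stream = map(stage, stream)   # lazy: nothing runs until consumed
    return stream

# Three toy stages: parse a line, transform the value, format output.
parse = lambda line: int(line.strip())
square = lambda x: x * x
fmt = lambda x: f"result={x}"

out = list(pipeline([parse, square, fmt], [" 1 ", "2", "3\n"]))
print(out)  # ['result=1', 'result=4', 'result=9']
```

Because each item flows through the stages independently, the same topology maps naturally onto pools of local processes or remote workers, which is the gap PaPy addresses.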
Marcin Cieślik, Cameron Mura
Parallel Kernels: An Architecture for Distributed Parallel Computing
Global optimization problems can involve huge computational resources. The need to prepare, schedule, and monitor hundreds of runs and to interactively explore and analyze data is a challenging problem. Managing such a complex computational environment requires a sophisticated software framework that can distribute the computation to remote nodes, hiding the complexity of the communication so that scientists can concentrate on the details of computation. We present PARK, the computational job management framework being developed as part of the DANSE project, which will offer a simple, efficient, and consistent user experience in a variety of heterogeneous environments, from multi-core workstations to global Grid systems. PARK will provide a single environment for developing and testing algorithms locally and executing them on remote clusters, while providing users with full access to their job history, including its configuration and input/output. This paper introduces the PARK philosophy, the PARK architecture, and current and future strategy in the context of global optimization algorithms.
P. A. Kienzle, N. Patel, M. McKerns
Convert-XY: type-safe interchange of C++ and Python containers for NumPy extensions
We present Convert-XY: a new, header-only template library for converting containers between C++ and Python with a simple, succinct syntax. At compile-time, template-based recursive pattern matching is performed on the static structure of the C++ type to build dynamic type checkers and conversion functions.
Damian Eads, Edward Rosten
High-Performance Code Generation Using CorePy
We present the CoreFunc framework, which utilizes CorePy to provide an environment for applying element-wise arithmetic operations (such as addition) to arrays and achieving high performance while doing so. To evaluate the framework, we develop and experiment with several ufunc operations of varying complexity. Our results show that CoreFunc is an excellent tool for accelerating NumPy-based applications.
Andrew Friedley, Christopher Mueller, Andrew Lumsdaine
Fast numerical computations with Cython
We discuss Cython's features for fast NumPy array access in detail through examples and benchmarks. We also consider using Cython to call natively compiled scientific libraries and using Cython in parallel computations. We conclude with a note on possible directions for future Cython development.
Dag Sverre Seljebotn
Cython tutorial
We describe the Cython language and show how it can be used both to write optimized code and to interface with external C libraries.
Stefan Behnel, Robert W. Bradshaw, Dag Sverre Seljebotn
Editorial
SciPy 2009 marks our eighth annual Python in Science conference and the second edition of the conference proceedings. The conference and these proceedings highlight the ongoing focus of the community on providing practical software tools, created to address real scientific problems.
Gael Varoquaux, Stéfan van der Walt, Jarrod Millman