
2Math

Wednesday, June 25, 2008

Acceleware Accelerates Matrix Computation up to 20 Times

Calgary, Canada-based Acceleware develops and markets acceleration products for today's most demanding computational and data-intensive applications. With Acceleware's products, run times for data processing and simulation applications have been reduced by more than 35 times.

The company's Accelerator™ Board and ClusterInABox™ Workstation solutions deliver a competitive advantage to companies whose product development cycles depend on compute- and data-intensive applications, radically enhancing their ability to interpret vast amounts of data effectively. The company's target markets mainly span the electromagnetics, energy, biomedical, pharmaceutical, industrial, and military sectors.

Last week, Acceleware announced the acceleration of matrix equation solvers found in simulation and data processing software commonly used for scientific and industrial applications.

These new methods significantly extend the problem-solving capability of the Acceleware Platform, giving a broad new range of application providers the opportunity to deliver the compelling performance of multi-core and hardware accelerators to their technical computing customers. With matrix methods added to the Acceleware Platform, end users of these advanced techniques can solve their linear equation problems up to 20 times faster.

‘The technical computing market has experienced explosive growth over the past four years at 20% a year, and is expected to continue in a high growth mode. Users running multi-physics, electromagnetic, mechanical, fluid dynamics, EDA and oil/gas codes are driving this growth and require major speed-ups on their applications to gain a competitive advantage. At the same time, processors and systems are becoming more complex, making it harder to obtain the required speed-ups. Linear solvers are used ubiquitously in scientific and industrial applications, simulation, and data processing, so speed-ups in this area directly help end users, providing faster time-to-market and increased scientific insight,’ said Earl C. Joseph, program vice president of HPC at IDC.

Acceleware’s matrix equation solver technology will be applicable for users of various electronic design automation, mechanical, fluid dynamics, microwave, photonics, signal integrity/power integrity tools and integrated circuit software tools, significantly reducing the simulation time in their design flow. This allows semiconductor, wireless handset, and consumer and industrial product manufacturers to bring higher-performance, feature-rich products to market sooner, giving them a competitive edge and transforming the way they solve problems. Oil and gas companies also use matrix methods to solve larger, more complicated reservoir simulations and to help pinpoint new oil reserves faster and with higher resolution.

By developing this matrix equation solver technology, which is based on Nvidia's CUDA C-language development environment, the reach of the Acceleware Platform extends into new markets, adding incremental value to existing users of Acceleware solutions. Acceleware’s latest capability speeds up large, sparse linear equation computational algorithms by up to 20 times.
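To give a sense of the kind of computation being accelerated, here is a minimal sketch in Python of solving a sparse, symmetric positive-definite linear system A x = b with the conjugate gradient method, the classic iterative approach for large systems arising from discretised simulations. It runs on the CPU with plain NumPy and is only illustrative: Acceleware's solvers do the equivalent work on GPU hardware through CUDA, and nothing below reflects their actual implementation.

import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=5000):
    """Iteratively solve A x = b for a symmetric positive-definite matrix A."""
    x = np.zeros_like(b)
    r = b - A @ x                   # residual
    p = r.copy()                    # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length along the search direction
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * np.linalg.norm(b):
            break                   # converged to the requested relative tolerance
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# A small 1-D Poisson-type system (tridiagonal, SPD) -- the structure that appears
# when a simulation discretises a differential equation. It is built densely here
# only to keep the example short.
n = 200
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))    # residual norm; should be close to zero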

For more information, visit the Acceleware products webpage.

Thursday, June 12, 2008

ACEnet & Mechdyne launch 'Data Cave' for 3-D Visualization of Complex Systems like Stars

Over the last five years, universities across Canada have been organizing themselves into regional groups to acquire and operate large-scale high performance computing (HPC) facilities for research. ACEnet ("Atlantic Computational Excellence Network") is Atlantic Canada's entry into this national fabric of HPC facilities. It is a partnership of seven institutions, including Memorial University of Newfoundland, University of New Brunswick, Mount Allison University, Dalhousie University, St. Francis Xavier University, St. Mary's University and the University of Prince Edward Island.

Scientific researchers in ACEnet now have access to an advanced three-dimensional (3D) visualization capability: a 5.8-megapixel Data Cave designed and installed by Mechdyne. [Mechdyne Corporation, headquartered in Marshalltown, Iowa, USA, is the world’s largest company dedicated to consulting and development of immersive, networked and collaborative visualization systems.]

On June 10, ACEnet launched the 'Data Cave', which projects stereoscopic 3D imagery onto three wall surfaces and the floor, allowing researchers to step inside and interact with computer-generated images in ways that can provide new insight and understanding.

At the grand opening, representatives from the nine ACEnet member universities and other VIPs experienced immersive visualization of complex scientific concepts, such as an astrophysicist’s calculations of how light and heat swirl away from an exploding star. As a scientific tool, the Data Cave will save scientists a great deal of time and improve their ability to interpret the results of extremely complex calculations. Conventional visualization of a star, for example, produces two-dimensional slices that do not convey much useful information. With immersive visualization, researchers can better see the 3D turbulence and other characteristics of a star, and can see the effect of modifications to the data or to their calculations much more quickly.

The Data Cave display system uses four Mechdyne Beacon SX+ projectors to generate a 1,400 x 1,050 pixel image on each of four screens measuring 10 ft. x 7.5 ft., roughly 5.9 million pixels in total and the source of the 5.8-megapixel figure above. The image is seen in stereo by users wearing LCD active-shutter glasses. An integrated real-time motion tracking system monitors the position of a primary user, maintaining an optimal viewing perspective and allowing interaction with the images using handheld control devices.
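For readers curious about what the tracking actually buys you, here is a minimal sketch, in Python, of the standard "generalized perspective projection" computation a Cave-style display performs for each wall every frame: given the tracked eye position and the wall's corner coordinates, it derives the asymmetric (off-axis) viewing frustum that keeps the imagery geometrically correct from wherever the viewer stands. This is the textbook recipe, not necessarily Mechdyne's implementation, and the screen dimensions below are only rough stand-ins for the 10 ft. x 7.5 ft. walls.

import numpy as np

def off_axis_frustum(eye, lower_left, lower_right, upper_left, near=0.1):
    """Return (left, right, bottom, top) frustum extents at the near plane for a
    viewer at `eye` looking at a rectangular screen defined by three of its corners."""
    vr = lower_right - lower_left
    vu = upper_left - lower_left
    vr = vr / np.linalg.norm(vr)        # screen's right axis
    vu = vu / np.linalg.norm(vu)        # screen's up axis
    vn = np.cross(vr, vu)               # screen normal, pointing toward the viewer
    vn = vn / np.linalg.norm(vn)

    va = lower_left - eye               # vectors from the eye to the screen corners
    vb = lower_right - eye
    vc = upper_left - eye

    d = -np.dot(va, vn)                 # perpendicular distance from eye to the screen
    left = np.dot(vr, va) * near / d
    right = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# A roughly 3.0 m x 2.3 m front wall in the z = 0 plane, with the tracked viewer
# standing off-centre; the frustum is recomputed from the eye position each frame.
eye = np.array([0.5, 1.6, 1.2])
ll = np.array([-1.5, 0.0, 0.0])
lr = np.array([1.5, 0.0, 0.0])
ul = np.array([-1.5, 2.3, 0.0])
print(off_axis_frustum(eye, ll, lr, ul))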

Since the Data Cave is based on the Mechdyne Flex display system design, it also can be easily configured as a 30ft long flat screen or as an angled ‘immersive theatre’ in addition to the cubic Cave format. The tracking system covers the entire 30ft width of the flat screen, three times the usual tracked space of immersive environments. This flexibility allows ACEnet scientists, such as geophysicists, to display data in formats commonly used in such fields as oil exploration and development.

Mechdyne worked closely with Sun Microsystems, which as ACEnet’s computing systems partner provided four Ultra 40 Workstations as the graphics engines for the display system. Mechdyne also collaborated with Iowa State University, a pioneer in using Cave-type display systems, to provide 3D visualisation software for the ACEnet Data Cave.

Friday, April 25, 2008

Mapping the Math in Music

The figure shows how geometrical music theory represents four-note chord types -- the collections of notes form a tetrahedron, with the colors indicating the spacing between the individual notes in a sequence. In the blue spheres the notes are clustered together; in the warmer colors they are farther apart. The red ball at the top of the pyramid is the diminished seventh chord, a popular 19th-century chord. Near it are all the most familiar chords of Western music. [Image: Dmitri Tymoczko]

More than 2000 years ago Pythagoras reportedly discovered that pleasing musical intervals could be described using simple ratios. And the so-called musica universalis or "music of the spheres" emerged in the Middle Ages as the philosophical idea that the proportions in the movements of the celestial bodies -- the sun, moon and planets -- could be viewed as a form of music, inaudible but perfectly harmonious.

Now, three music professors – Clifton Callender at Florida State University, Ian Quinn at Yale University and Dmitri Tymoczko at Princeton University -- have devised a new way of analyzing and categorizing music that takes advantage of the deep, complex mathematics they see enmeshed in its very fabric.

In a recent article in 'Science', the trio has outlined a method called "geometrical music theory" that translates the language of musical theory into that of contemporary geometry. They take sequences of notes, like chords, rhythms and scales, and categorize them so they can be grouped into "families." They have found a way to assign mathematical structure to these families, so they can then be represented by points in complex geometrical spaces, much the way "x" and "y" coordinates, in the simpler system of high school algebra, correspond to points on a two-dimensional plane. Different types of categorization produce different geometrical spaces, and reflect the different ways in which musicians over the centuries have understood music.
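As a toy illustration of what "grouping chords into families and treating them as points" can mean in code (this follows the general idea of octave and ordering equivalence, not the authors' exact construction), the following Python snippet reduces chords to a canonical point and measures how far apart two such points are by the smallest total motion of their voices:

import itertools

def chord_point(chord):
    """Collapse a chord (MIDI note numbers) to a canonical point: octave
    equivalence (mod 12) plus ordering equivalence (sort the result)."""
    return tuple(sorted(pitch % 12 for pitch in chord))

def voice_leading_size(chord_a, chord_b):
    """Smallest total motion, in semitones, needed to move one chord's voices onto
    the other's, minimizing over all ways of pairing the voices."""
    a = chord_point(chord_a)
    best = None
    for perm in itertools.permutations(chord_point(chord_b)):
        total = sum(min((x - y) % 12, (y - x) % 12) for x, y in zip(a, perm))
        best = total if best is None else min(best, total)
    return best

c_major = [60, 64, 67]            # C E G
e_minor = [64, 67, 71]            # E G B
print(chord_point(c_major))       # (0, 4, 7): one point stands for every C-major triad
print(voice_leading_size(c_major, e_minor))  # 1: only C -> B has to move

In this picture, a "family" of chords collapses to a single point, and nearby points are chords connected by small, smooth voice leadings, which is the kind of structure the geometrical spaces described in the paper are designed to capture.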

This achievement, they expect, will allow researchers to analyze and understand music in much deeper and more satisfying ways. The method, according to its authors, allows them to analyze and compare many kinds of Western (and perhaps some non-Western) music. (The method focuses on Western-style music because concepts like "chord" are not universal in all styles.) It also incorporates many past schemes by music theorists to render music into mathematical form. "The music of the spheres isn't really a metaphor -- some musical spaces really are spheres," said Tymoczko, an assistant professor of music at Princeton. "The whole point of making these geometric spaces is that, at the end of the day, it helps you understand music better. Having a powerful set of tools for conceptualizing music allows you to do all sorts of things you hadn't done before."

The work represents a significant departure from other attempts to quantify music, according to Rachel Wells Hall of the Department of Mathematics and Computer Science at St. Joseph's University in Philadelphia. In an accompanying essay, she writes that their effort, "stands out both for the breadth of its musical implications and the depth of its mathematical content."

Reference
"Generalized Voice-Leading Spaces"
Clifton Callender, Ian Quinn, Dmitri Tymoczko
Science, Vol. 320, No. 5874, pp. 346–348 (18 April 2008)
Abstract Link

[This posting is based on a press release by Princeton University]

Thursday, April 17, 2008

Supercomputer Simulates Merger of Three Black Holes

Simulated paths of three black holes merging. (Image courtesy: Rochester Institute of Technology)

The same team of astrophysicists that cracked the computer code simulating two black holes crashing and merging together has now, for the first time, simulated a three-black-hole collision.

Manuela Campanelli, Carlos Lousto and Yosef Zlochower—scientists in Rochester Institute of Technology’s Center for Computational Relativity and Gravitation—simulated triplet black holes to test their breakthrough method that, in 2005, merged two of these large mass objects on a supercomputer following Einstein’s theory of general relativity.

The new simulation of multiple black holes evolving, orbiting and eventually colliding confirmed that the computer code is robust and not tied to a fixed number of black holes. The May issue of Physical Review D will publish the team’s latest findings in the article “Close Encounters of Three Black Holes,” revealing the distinct gravitational signature three black holes might produce. The story will run in the “Rapid Communications” section.

“We discovered rich dynamics leading to very elliptical orbits, complicated orbital dynamics, simultaneous triple mergers and complex gravitational waveforms that might be observed by gravitational wave detectors such as LIGO and LISA,” says Lousto, professor in RIT’s School of Mathematical Sciences. “These simulations are timely because a triple quasar was recently discovered by a team led by Caltech astronomer George Djorgovski. This presumably represents the first observed supermassive black hole triplet.”

The RIT team’s triple merger simulates the simplest case of equal masses and nonspinning black holes, a prerequisite for exploring configurations of unequal masses and different spins and rotations. The center’s supercomputer cluster “newHorizons” processed the simulations and performed evolutions of up to 22 black holes to verify the results.

“Twenty-two is not going to happen in reality, but three or four can happen,” says Yosef Zlochower, an assistant professor in the School of Mathematical Sciences. “We realized that the code itself really didn’t care how many black holes there were. As long as we could specify where they were located—and had enough computer power—we could track them.”
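Purely as a contrast to what the RIT code actually does, the following Python sketch shows the "specify where they are and track them" idea in the simplest possible setting: a Newtonian point-mass integrator for any number of bodies. Real black-hole mergers require solving the full equations of general relativity on a supercomputer; this toy, with made-up initial conditions and code units, only illustrates why the body count is not baked into such a code.

import numpy as np

G = 1.0  # gravitational constant in code units

def accelerations(pos, mass, softening=1e-3):
    """Pairwise Newtonian gravitational accelerations on every body."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        diff = pos - pos[i]                                  # vectors from body i to all bodies
        dist3 = (np.sum(diff**2, axis=1) + softening**2) ** 1.5
        dist3[i] = np.inf                                    # no self-force
        acc[i] = G * np.sum(mass[:, None] * diff / dist3[:, None], axis=0)
    return acc

def evolve(pos, vel, mass, dt=1e-3, steps=20000):
    """Leapfrog (kick-drift-kick) integration; returns the trajectory of every body."""
    pos, vel = pos.copy(), vel.copy()
    traj = [pos.copy()]
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
        traj.append(pos.copy())
    return np.array(traj)

# Three equal, non-spinning point masses; the same routine handles 4, 22, or more
# bodies simply by adding rows to the arrays.
pos = np.array([[1.0, 0.0], [-0.5, 0.8], [-0.5, -0.8]])
vel = np.array([[0.0, 0.3], [-0.3, -0.15], [0.3, -0.15]])
mass = np.ones(3)
trajectory = evolve(pos, vel, mass)
print(trajectory.shape)  # (steps + 1, number of bodies, spatial dimensions)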

Specially designed high-performance computers like newHorizons are essential tools for scientists like Campanelli’s team, who specialize in computational astrophysics and numerical relativity, a research field dedicated to testing Einstein’s theory of general relativity. Only supercomputers can simulate the force of impact necessary to generate gravitational waves, warps in space-time that might provide clues to the origin of the universe.

Scientists expect to measure actual gravitational waves for the first time within the next decade, using the ground-based detector known as the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the future NASA/European Space Agency space mission, the Laser Interferometer Space Antenna (LISA).

“In order to confirm the detection of gravitational waves, scientists need the modeling of gravitational waves coming from space,” says Campanelli, director of RIT’s Center for Computational Relativity and Gravitation. “They need to know what to look for in the data they acquire, otherwise it will look like just noise. If you know what to look for, you can confirm the existence of gravitational waves. That’s why they need all these theoretical predictions.”

Adds Lousto: “Gravity waves can also confirm the existence of black holes directly because they have a special signature. That’s what we’re simulating. We are predicting a very specific signature of black hole encounters. And so, if we check that, there’s very strong evidence of the existence of black holes.”

For more information about RIT’s Center for Computational Relativity and Gravitation, visit CCRG. To see a visualization by CCRG member Hans-Peter Bischof tracing the interaction of a trio of same-sized masses, click on CCRG Movies.

Facts about newHorizons:

The “newHorizons” computer cluster at RIT’s Center for Computational Relativity and Gravitation boasts 85 nodes, each with dual processors providing four computing cores per node, high-speed InfiniBand interconnections and 36 terabytes of storage space.

The standard Mac desktop computer has 2 gigabytes of memory. By comparison, each node in newHorizons has 16 gigabytes, for a total of roughly 1.4 terabytes across the cluster. In addition, the InfiniBand interconnect makes the computer especially fast, moving “packets” of information with a lag time, or latency, of 1.29 microseconds. The high-performance computer, built with hardware from California-based Western Scientific, will operate at its maximum capacity 24 hours a day for four to five years.

This posting is based on materials provided by Rochester Institute of Technology.