Johns Hopkins Redefines How Supercomputers Are Built
Johns Hopkins is redefining the rules of building supercomputers.
The approach is well suited to the data-mining-oriented scientific workloads that today's supercomputers increasingly process. Alexander Szalay, a computer scientist and astrophysicist at Johns Hopkins' Institute for Data Intensive Engineering and Science, is leading the project, dubbed Data-Scope. He explained:
For the sciences, it is the I/O that is becoming the major bottleneck. People are running larger and larger simulations, and they take up so much memory, it is difficult to write the output to disk.
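The imbalance Szalay describes is easy to see in miniature: on commodity hardware, generating a large simulation state is often far faster than persisting it. The sketch below is purely illustrative (the array size and file name are invented, not Data-Scope specifics); it times an in-memory compute step against the disk write of its output.

```python
import time
import numpy as np

# Hypothetical simulation step: ~800 MB of double-precision state.
# The size is illustrative only, not Data-Scope's actual scale.
n = 100_000_000  # 1e8 doubles ~= 800 MB

t0 = time.perf_counter()
state = np.random.default_rng(0).standard_normal(n)  # stand-in for a compute step
compute_s = time.perf_counter() - t0

t0 = time.perf_counter()
state.tofile("checkpoint.bin")  # writing the output to disk
io_s = time.perf_counter() - t0

print(f"compute: {compute_s:.1f}s  disk write: {io_s:.1f}s")
# On spinning disks the write commonly dominates: the I/O bottleneck in miniature.
```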
Data-Scope is backed by a $2.1 million grant from the National Science Foundation. When complete, it will be able to handle five petabytes of data, and it will allow Szalay and a host of researchers at Johns Hopkins and other institutions, including universities and national laboratories, to conduct research directly in the database, SpaceDaily.com reports.
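Conducting "research directly in the database" means moving the computation to the data: instead of downloading petabytes for local analysis, scientists push queries to where the data lives and pull back only the small results. A minimal sketch of the idea, using an in-memory SQLite database with an invented table of simulation snapshots (the table and column names are hypothetical, not Data-Scope's schema):

```python
import sqlite3

# Toy stand-in for a large scientific archive; schema is hypothetical.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE particles (snapshot INTEGER, x REAL, y REAL, z REAL, mass REAL)")
db.executemany(
    "INSERT INTO particles VALUES (?, ?, ?, ?, ?)",
    [(0, 0.1, 0.2, 0.3, 1.0), (0, 0.4, 0.5, 0.6, 2.0), (1, 0.7, 0.8, 0.9, 1.5)],
)

# In-database analysis: the aggregation runs where the data lives,
# so only the tiny summary crosses the wire, never the raw rows.
for snapshot, total_mass, n in db.execute(
    "SELECT snapshot, SUM(mass), COUNT(*) FROM particles GROUP BY snapshot"
):
    print(f"snapshot {snapshot}: {n} particles, total mass {total_mass}")
```

At petabyte scale the same pattern pays off dramatically, since shipping a few aggregate rows is cheap while shipping the underlying data is not.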