Core Facilities: Neuroimaging

Neuroimaging Compute Facility

Recent advances in neuroimaging have greatly enhanced our understanding of the brain and mind. With these developments come continually growing computational demands and challenges: increases in the sheer volume of data due to improvements in spatial and temporal resolution, together with the complexity of integrating neuroimaging data sets from multiple imaging modalities with behavioral and genetic information, mean that teaching, data analysis, storage, visualization, sharing/availability, and security demand ever-advancing resources and expertise.

The Neuroimaging Compute Facility (NCF) is a central enabling infrastructure for neuroimaging teaching and research. Its mission is to provide high-performance, high-powered, robust, reliable, and secure computer systems, along with human expertise, to meet the challenges of neuroimaging research and teaching. Students and researchers interested in using the NCF must sign up for a User Account. The NCF is a collaboration between the Center for Brain Science and the Life Sciences Division of Research Computing.

120-CPU high-performance compute cluster
The main computational server comprises 30 Dell 1955 blade servers, each housing two dual-core 2.66 GHz processors. Twenty-six of these nodes have 8 GB of RAM; the remaining four have 16 GB for high-memory jobs. The nodes are configured to run the Linux and Solaris operating systems.


High-volume, high-performance disk space
An EMC CLARiiON CX3-40 currently provides over 54 TB of usable, RAID-protected disk space for the user community, expandable to 150+ TB. The CLARiiON is fronted by an EMC NSX gateway that allows file-level access via NFS.


Secure data backup and archive
Tape backup (regularly running snapshots of data) and archive (permanent storage) are provided by an ADIC Scalar i2000 tape backup system. The current capacity of 80 TB of backup space is easily expandable to over 600 TB.


Secure, fast network access
Connectivity to the cluster and storage is provided via 10 Gb links on an enterprise-class Cisco 6509 switch. Backup data is routed separately over Cisco MDS 9120 SAN switches. The network is secured and protected by a firewall.




Data Analysis:

Common tools for data analysis are available and updated regularly. Licenses and software packages for data analysis include Analyze, FreeSurfer, FIV, FSL, MATLAB, and SPM. In addition, a developing shared resource will house common processing scripts and tutorials for neuroimaging data analysis, including merged atlases for analyses of older adults and children, as well as innovative procedures for high-resolution data analysis and surface visualization. To access these software tools, register for an NCF computer account.
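As a small illustration of the tools listed above, a common first step with FSL is brain extraction using its `bet` command. The sketch below (filenames and the fractional-intensity value are hypothetical, not NCF defaults) assembles such a command line in Python rather than hard-coding it, which makes it easy to script over many subjects:

```python
import shlex

def bet_command(t1_image, brain_out, frac=0.5):
    """Build an FSL brain-extraction (`bet`) command line.

    `-f` is bet's fractional intensity threshold (0-1);
    smaller values keep more tissue. 0.5 is bet's own default.
    """
    return ["bet", t1_image, brain_out, "-f", str(frac)]

# Hypothetical subject paths for illustration only.
cmd = bet_command("subj01/T1.nii.gz", "subj01/T1_brain.nii.gz")
print(" ".join(shlex.quote(part) for part in cmd))
# → bet subj01/T1.nii.gz subj01/T1_brain.nii.gz -f 0.5
```

On a system where FSL is installed, the resulting list could be passed to `subprocess.run(cmd, check=True)` to execute the extraction.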

Neuroinformatics tools:

XNAT – The eXtensible Neuroimaging Archive Toolkit
The XNAT database provides infrastructure for neuroimaging data management and data sharing.

OASIS – Open Access Series of Imaging Studies
Open-access data for teaching and research: freely available data provide opportunities for those learning to explore analysis tools and become familiar with neuroimaging research. Such data are also important for methods development and novel discovery. A continually growing data library is provided for the Harvard community and other interested individuals.


  • For assistance with IT-related issues (including registering your computer with the Harvard network, workstations, wireless access, etc.), or questions about using the NCF, contact:
    Please provide as much detail as possible so that the Helpdesk staff can triage your problem quickly.
  • Request a new NCF User Account


revised 06/2008 jmg