Work with your data

Your research data is an important asset. It requires accessible documentation, secure storage, standardised organisation, reproducible analysis and impactful visualisations. Griffith has the software, computational tools and support you need to find, capture, process, analyse and visualise your research data to meet publication requirements.

Capture data

Capturing research data is a critical first step in creating reliable, reproducible studies. Clear documentation and comprehensive metadata—information describing your data—make your work easier to find, interpret, reuse and cite.

During data collection, follow these best practices:

  • Document your data and processes to ensure transparency and context.
  • Use open formats to support long-term access and verification.
  • Organise and describe data at the point of creation to preserve integrity.
  • Capture high-resolution data using appropriate technology.
  • Record ownership, IP, privacy, and consent information clearly.
  • Use tools and processes to:
    • track data provenance—origin, transformations, and history
    • record quality metadata at the point of capture
    • structure data in open, flexible formats
    • ensure compliance with privacy, ethics and consent agreements.
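The capture-stage practices above can be sketched in code. Below is a minimal Python example that records provenance and an integrity checksum in a JSON sidecar file at the point of capture; the field names are illustrative only, not a prescribed metadata standard, so adapt them to the schema your discipline or repository requires.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def write_metadata_sidecar(data_path, creator, instrument):
    """Record basic provenance for a data file in a JSON sidecar.

    Field names here are illustrative, not a formal metadata standard.
    """
    data = Path(data_path).read_bytes()
    record = {
        "file": Path(data_path).name,
        "creator": creator,                      # ownership / attribution
        "instrument": instrument,                # how the data was captured
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(data).hexdigest(),  # integrity check
    }
    sidecar = Path(data_path).with_suffix(".meta.json")
    sidecar.write_text(json.dumps(record, indent=2))
    return sidecar
```

Writing the sidecar at capture time, rather than retrospectively, is what preserves context: the checksum lets later users verify the file is unchanged, and the open JSON format keeps the record readable long-term.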

Use these best practice guides to document your data.

UK Data Service

MIT Libraries

Survey tools at Griffith

REDCap

Qualtrics

Analyse data

Before analysis, ensure your data is clean and consistent by identifying and correcting errors, removing duplicates, reformatting, merging or splitting datasets and performing other wrangling tasks.
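As a sketch of the wrangling steps above, the short Python function below trims whitespace and removes blank and exact-duplicate rows. It is illustrative rather than a recommended workflow; larger cleaning jobs are usually handled with a dedicated library such as pandas.

```python
def clean_rows(rows):
    """Trim whitespace from each cell, then drop blank rows and exact
    duplicate rows while preserving the original row order."""
    seen = set()
    cleaned = []
    for row in rows:
        stripped = tuple(cell.strip() for cell in row)
        if not any(stripped):      # skip rows that are entirely empty
            continue
        if stripped in seen:       # remove exact duplicates
            continue
        seen.add(stripped)
        cleaned.append(list(stripped))
    return cleaned
```

For example, `clean_rows([[" a", "b"], ["a ", "b"], ["", ""]])` collapses the whitespace variants into a single `["a", "b"]` row.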

Ensure your research can be successfully replicated by others and yourself by organising, naming, versioning and documenting data and files using standard methods and consistent formats.
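A consistent naming and versioning scheme can be automated. The Python helper below builds file names combining an ISO date, a hyphenated description and a zero-padded version number; this pattern is one common convention, not a mandated standard, so adjust it to your project's agreed format.

```python
from datetime import date

def versioned_name(project, description, version, ext="csv", when=None):
    """Build a name like '2024-05-01_survey_cleaned-data_v02.csv'.

    ISO date + slug + zero-padded version is one common convention,
    not a prescribed standard.
    """
    when = when or date.today()
    slug = "-".join(description.lower().split())   # 'Cleaned Data' -> 'cleaned-data'
    return f"{when.isoformat()}_{project}_{slug}_v{version:02d}.{ext}"
```

Because the date sorts lexicographically and versions are zero-padded, files listed alphabetically also appear in chronological and version order.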

Prepare and analyse your data with these tools:

  • Gale Digital Scholar Lab—analyse text from historical primary source collections
  • ArcGIS—geographic information and mapping
  • Leximancer—textual analysis and visualisation
  • NVivo—qualitative analysis of texts
  • MATLAB—mathematics and technical computing
  • Stata—statistics and data science
  • SPSS—statistics.

Find these and other tools via the Software catalogue.

Analyse large datasets with the power of Griffith's High Performance Computing (HPC).

Find a suitable visualisation for your data by exploring these open-source tools:

  • Voyant Tools—reading and analysis for digital texts
  • RawGraphs—create visualisations for complex data
  • Gephi—network analysis and visualisations
  • Cytoscape—complex network analysis and visualisations
  • R or Python—to build with custom code.
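For the "custom code" route, the Python sketch below generates a simple horizontal bar chart as an SVG string using only the standard library. Real projects more often reach for a plotting library such as matplotlib (Python) or ggplot2 (R), so treat this as an illustration of the approach rather than a recommended tool.

```python
def bar_chart_svg(values, labels, width=320, bar_height=20, gap=6):
    """Render a horizontal bar chart as an SVG string.

    A standard-library sketch of building a visualisation with custom
    code; not a substitute for a full plotting library.
    """
    peak = max(values)
    height = len(values) * (bar_height + gap)
    parts = [f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">']
    for i, (value, label) in enumerate(zip(values, labels)):
        y = i * (bar_height + gap)
        w = int(value / peak * (width - 100))    # leave 100px for labels
        parts.append(f'<rect x="100" y="{y}" width="{w}" height="{bar_height}" fill="steelblue"/>')
        parts.append(f'<text x="0" y="{y + bar_height - 5}">{label}: {value}</text>')
    parts.append("</svg>")
    return "\n".join(parts)
```

The resulting SVG is plain text, so it can be saved to a `.svg` file and opened in any browser or embedded directly in a web page.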

Many publishers require charts, graphs and other images to be submitted as separate files with your article. Check each publisher's image submission requirements to identify acceptable file formats, resolution, sizing, captioning and other details.

Use image conversion tools to turn charts exported from Microsoft Excel into publication-quality image files.
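As one illustrative approach, assuming the Pillow imaging library is available, the Python snippet below converts an exported PNG to a TIFF tagged at 300 dpi; that combination is commonly accepted, but always confirm your target journal's actual requirements.

```python
from PIL import Image  # Pillow, a widely used Python imaging library

def to_publication_tiff(src_path, dst_path, dpi=300):
    """Convert an image exported from Excel (e.g. PNG) to a TIFF tagged
    at the given resolution. 300 dpi TIFF is a common publisher ask,
    but not universal; check the journal's own guidelines."""
    img = Image.open(src_path).convert("RGB")
    img.save(dst_path, format="TIFF", dpi=(dpi, dpi))
```

Note that tagging a file at 300 dpi does not add detail: export the chart from Excel at the largest practical pixel dimensions first, then convert.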

Develop your skills

Learn from the Programming Historian how to use digital tools, techniques and workflows that facilitate research in any discipline.

View tutorials

Attend a library workshop or enrol in one of our tutorials. We have a range of resources to support you through every stage of the research lifecycle.

Browse training options

Find tools and tutorials that help you work with data from galleries, libraries, archives and museums.

Explore the GLAM workbench

Explore the various workshops, tutorials and self-help resources offered by eResearch Services.

Research computing training and support

Questions

We are here to help!

Find us in the libraries or contact us by phone or online.