Work with your data
Your research data is an important asset. It requires accessible documentation, secure storage, standardised organisation, reproducible analysis and impactful visualisations. Griffith has the software, computational tools and support you need to find, capture, process, analyse and visualise your research data to meet publication requirements.
Capture data
Capturing research data is a critical first step in creating reliable, reproducible studies. Clear documentation and comprehensive metadata—information describing your data—make your work easier to find, interpret, reuse and cite.
During data collection, follow these best practices:
- Document your data and processes to ensure transparency and context.
- Use open formats to support long-term access and verification.
- Organise and describe data at the point of creation to preserve integrity.
- Capture high-resolution data using appropriate technology.
- Record ownership, IP, privacy, and consent information clearly.
- Use tools and processes to:
  - track data provenance—origin, transformations and history
  - record quality metadata at the point of capture (see the sketch after this list)
  - structure data in open, flexible formats
  - ensure compliance with privacy, ethics and consent agreements.
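For example, a small capture script can write data to an open format and store descriptive metadata alongside it. The Python sketch below is illustrative only: the file names, fields and values are hypothetical placeholders, not a Griffith template.

```python
import csv
import json
from datetime import datetime, timezone

# Hypothetical readings captured from an instrument or survey export.
readings = [
    {"site": "A01", "measured_at": "2024-05-14T09:30:00+10:00", "value_mg_l": 3.2},
    {"site": "A02", "measured_at": "2024-05-14T10:05:00+10:00", "value_mg_l": 2.9},
]

# Store the data in an open, structured format (CSV) rather than a proprietary one.
with open("water_quality.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["site", "measured_at", "value_mg_l"])
    writer.writeheader()
    writer.writerows(readings)

# Record descriptive and provenance metadata in a sidecar file at the point of capture.
metadata = {
    "title": "Water quality readings, survey A",         # what the data are
    "created": datetime.now(timezone.utc).isoformat(),   # when they were captured
    "creator": "J. Researcher",                          # who captured them
    "instrument": "Handheld probe, model X (assumed)",   # how they were captured
    "ethics_approval": "Reference 2024/NNN (placeholder)",  # consent and ethics reference
    "provenance": "Raw export; no transformations applied yet",
    "licence_and_ip": "To be confirmed with project lead",
}
with open("water_quality.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

Storing the metadata as a sidecar file next to the data keeps the description with the dataset wherever it is moved or archived.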
Use these best practice guides to document your data.
Survey tools at Griffith
Analyse data
Before analysis, ensure your data is clean and consistent by identifying and correcting errors, removing duplicates, reformatting, merging or splitting datasets and performing other wrangling tasks.
Ensure that you and others can replicate your research by organising, naming, versioning and documenting data and files using standard methods and consistent formats.
The following tools can help prepare your data:
- OpenRefine for tabular data
- Microsoft Excel
- R or Python.
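As an illustration of these wrangling steps in Python, the sketch below uses the pandas library on a hypothetical survey export; the file and column names are placeholders, and the exact cleaning rules will depend on your own dataset.

```python
import pandas as pd

# Load a hypothetical raw export (file and column names are placeholders).
df = pd.read_csv("survey_raw.csv")

# Standardise column names and tidy text values.
df.columns = df.columns.str.strip().str.lower().str.replace(" ", "_")
df["postcode"] = df["postcode"].astype(str).str.strip()

# Correct obvious errors: coerce ages to numbers and drop impossible values.
df["age"] = pd.to_numeric(df["age"], errors="coerce")
df = df[df["age"].between(0, 120)]

# Remove exact duplicate rows and reformat dates consistently.
df = df.drop_duplicates()
df["date_collected"] = pd.to_datetime(df["date_collected"], errors="coerce")

# Save the cleaned dataset separately so the raw file is preserved.
df.to_csv("survey_clean.csv", index=False)
```

Keeping the raw export untouched and saving the cleaned copy under a new name makes the wrangling step itself reproducible.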
Analyse data with these tools:
- Gale Digital Scholar Lab—analyse text from historical primary source collections
- ArcGIS—geographic information and mapping
- Leximancer—textual analysis and visualisation
- NVivo—qualitative analysis of texts
- MATLAB—mathematics and technical computing
- Stata—statistics and data science
- SPSS—statistics.
Find these and other tools via the Software catalogue.
Analyse large datasets with the power of Griffith's High Performance Computing (HPC).
Visualise data
Find a suitable visualisation for your data by exploring these tools:
Use these open-source tools to visualise data:
Publishers require charts, graphs and other images to be submitted as separate files with your article. Read each publisher's specific image submission requirements to identify acceptable file formats, resolution, captioning and other details.
Use these tools to convert images exported from Microsoft Excel to publication-quality image files.
- Adobe Photoshop via Creative Cloud (Windows – staff download)
- Preview (macOS)
- Pixillion Image Converter (Windows – free download)
- PhotoPad (Windows – free download).
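If you prefer a scripted alternative to these desktop tools, the short Python sketch below uses the Pillow imaging library to convert a chart exported from Excel into a TIFF with an embedded 300 dpi tag. The file names and the 300 dpi target are assumptions; always check your publisher's actual requirements.

```python
from PIL import Image  # Pillow: pip install pillow

# Hypothetical chart exported from Excel as a PNG.
chart = Image.open("figure1.png")

# Convert to TIFF and embed a 300 dpi resolution tag, a common publisher requirement.
# Note: this changes the format and metadata only; it cannot add detail the original
# export lacks, so export the chart from Excel at the largest size you can.
chart.save("figure1.tiff", dpi=(300, 300))
```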
Develop your skills
Learn from the Programming Historian how to use digital tools, techniques and workflows that facilitate research in any discipline.
Attend a library workshop or enrol in one of our tutorials. We have various resources targeted to support you through every stage of the research lifecycle.
Find tools and tutorials that help you work with data from galleries, libraries, archives and museums.
Explore the various workshops, tutorials and self-help resources offered by eResearch Services.
Questions
We are here to help!
Find us in the libraries or contact us by phone or online.
Copyright matters
Find information and support for all aspects of your copyright compliance obligations.
eResearch Services
For help with your research technology, data science and technical infrastructure needs.
Office for Research
For help with research grant funding opportunities, ethics and IP matters.
Workshops
Attend a workshop targeted to support you throughout the research lifecycle.