Archiving Your Data with DaSCH

Archiving research data with DaSCH follows a structured workflow. DaSCH provides expert support at every step, tailored to each project's specific needs.

Support Throughout the Process

Researchers with data to archive can contact DaSCH at info@dasch.swiss for:

  • Discussion of data and project goals
  • Assessment of technical requirements and available resources
  • Recommendations on the most suitable archiving approach
  • Guidance throughout the entire archiving process

DaSCH supports researchers from diverse disciplines and technical backgrounds.

What DaSCH Provides

Long-Term Preservation

The DaSCH Service Platform (DSP) serves as a secure repository designed for humanities research data. DaSCH specializes in preserving data while ensuring long-term accessibility:

Preserved content includes: Structured data, files (images, documents, audio, video), and static websites (pre-built HTML/CSS/JavaScript)

Note: DaSCH focuses on data preservation rather than dynamic web applications with database dependencies

Access Options

The DaSCH Service Platform offers two primary access methods:

  • DSP-APP: Web-based interface for viewing and editing data directly in the browser, ideal for interactive work
  • DSP-API: Programmatic access for integration with other applications and automated workflows

Both access methods work with structured data organized according to a data model that defines data elements and their relationships.
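
As a concrete illustration of programmatic access: DSP-API identifies each resource by an IRI, which must be percent-encoded when it appears in a request URL. The sketch below builds such a request URL; the endpoint path follows DSP-API v2 conventions, but the host and IRI are made-up examples, not real identifiers.

```python
from urllib.parse import quote

def resource_url(host: str, resource_iri: str) -> str:
    """Build a DSP-API v2 request URL for a single resource."""
    # safe="" ensures "/" and ":" inside the IRI are percent-encoded too
    return f"{host}/v2/resources/{quote(resource_iri, safe='')}"

# Hypothetical host and resource IRI for illustration only
print(resource_url("https://api.dasch.swiss", "http://rdfh.ch/0001/example"))
```

A GET request to the resulting URL would return the resource's data; consult the DSP-API documentation for authentication and response formats.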

Expert Guidance

The DaSCH team provides consultation, technical guidance, training, and ongoing support throughout the archiving process and beyond.

The Archiving Process

The archiving process is adaptable to each project's needs. The typical workflow follows these steps:

Step 1: Data Model Creation

What is a data model? A formal description of how data is structured, similar to a blueprint for organizing information.

A data model consists of:

  • Resource Classes: The types of entities in the data (e.g., Book, Author, Letter)
  • Properties: Information about each resource class (e.g., hasTitle, hasPublicationDate, linkToAuthor)
  • Cardinalities: Rules defining which properties are required and how many values are allowed (e.g., one hasTitle required, multiple linkToAuthor allowed)
  • Lists: Controlled vocabularies providing predefined options (e.g., hasColor limited to "green", "blue", or "red")
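
To make these four building blocks concrete, here is a minimal sketch of a data model for a small book catalogue, written as a Python dict and serialized to JSON. DSP-TOOLS project definitions are JSON files, but the key names below are simplified for illustration and may differ from the actual DSP-TOOLS schema.

```python
import json

# Illustrative data model: two resource classes, cardinality rules,
# and one controlled-vocabulary list. Key names are simplified.
data_model = {
    "resources": [
        {
            "name": "Book",
            "cardinalities": [
                # exactly one title is required
                {"property": "hasTitle", "cardinality": "1"},
                # any number of links to authors are allowed
                {"property": "linkToAuthor", "cardinality": "0-n"},
            ],
        },
        {
            "name": "Author",
            "cardinalities": [
                {"property": "hasName", "cardinality": "1"},
            ],
        },
    ],
    "lists": [
        # controlled vocabulary restricting hasColor to three values
        {"name": "colorList", "nodes": ["green", "blue", "red"]}
    ],
}

print(json.dumps(data_model, indent=2))
```

In practice this structure is worked out together with DaSCH, then encoded in the format DSP-TOOLS expects.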

DaSCH supports this step by:

  • Facilitating discussion of research data organization
  • Working with researchers to define appropriate data structures
  • Incorporating mappings to reference ontologies for better interoperability
  • Providing expertise in data modeling standards and best practices

See also: Naming Conventions

Step 2: Data Preparation

What typically needs preparation:

  • Files: Images, documents, audio, video, and other digital objects
  • Data: Descriptions and relationships, often from spreadsheets or databases
  • Documentation: Explanations of how files and data map to the data model

DaSCH provides:

  • Clear guidance on file format requirements
  • Naming convention standards
  • Custom script development for data cleaning and transformation
  • Iterative refinement support

Related resources: Supported File Formats | Naming Conventions | Anonymization

Step 3: Testing with a Test Server

Researchers receive a dedicated test server as a secure environment for experimentation and refinement.

Test server setup includes:

  • Environment configuration by DaSCH
  • Data model upload or creation assistance
  • Sample data entry capability

Available working methods:

  • Manual entry in DSP-APP: Create and edit data directly in the browser; suitable for smaller datasets or for researchers who prefer hands-on control
  • Bulk import with DSP-TOOLS: Import large datasets efficiently using structured files (JSON, XML) with DaSCH team assistance
  • Combination approach: Use both methods as needed
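
For the bulk-import route, the dataset is described in a structured file. The sketch below generates a minimal XML fragment in the spirit of the DSP-TOOLS import format; the element and attribute names are illustrative simplifications, so consult the DSP-TOOLS documentation for the exact schema.

```python
import xml.etree.ElementTree as ET

# Build a tiny import file with one Book resource.
# Element/attribute names are illustrative, not the exact schema.
root = ET.Element("knora")  # root element name is an assumption

book = ET.SubElement(
    root, "resource",
    {"id": "book_1", "restype": ":Book", "label": "Example Book"},
)
title_prop = ET.SubElement(book, "text-prop", {"name": ":hasTitle"})
title = ET.SubElement(title_prop, "text", {"encoding": "utf8"})
title.text = "An Example Title"

xml_str = ET.tostring(root, encoding="unicode")
print(xml_str)
```

Files like this are typically produced by a transformation script from the project's spreadsheets or databases, which is exactly the kind of custom scripting DaSCH helps with in Step 2.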

Researchers can review, test, and provide feedback. If adjustments are needed, the data model is refined through discussion and iteration. This process continues until the results meet project requirements.

Note: Test servers are intended for experimentation only; the finalized dataset is deployed separately to the production server (see Step 5).

Step 4: Metadata and Formal Requirements

Every dataset requires comprehensive metadata to help other researchers discover, understand, and properly cite the work.

DaSCH provides:

  • Clear metadata templates
  • Guidance on required information
  • Explanation of copyright and licensing options
  • Deposit agreement preparation

Metadata becomes publicly searchable on meta.dasch.swiss, ensuring research discoverability.

Learn more: Copyright & Licenses | FAIR and CARE Principles

Step 5: Production Deployment

Once test results meet project requirements, DaSCH handles the technical deployment:

  • Upload finalized data model to production server
  • Import complete dataset
  • Configure access permissions according to project specifications
  • Ensure proper metadata publication

Researchers receive:

  • Full data access through DSP-APP
  • Permanent identifiers (ARKs) for stable, long-term citation
  • Data archived according to FAIR principles
  • Public metadata at meta.dasch.swiss
  • Data accessible to other researchers at app.dasch.swiss and programmatically via the DSP-API
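
ARKs are suitable for long-term citation because they follow a simple, stable pattern: a resolver host followed by ark:/<NAAN>/<name>, where the NAAN (Name Assigning Authority Number) identifies the assigning organization. The sketch below composes such an identifier; the resolver host, NAAN, and name are placeholders, not real DaSCH values.

```python
def ark_url(resolver: str, naan: str, name: str) -> str:
    """Compose a resolvable ARK URL: <resolver>/ark:/<NAAN>/<name>."""
    return f"{resolver}/ark:/{naan}/{name}"

# All three components below are made-up placeholders
print(ark_url("https://ark.example.org", "99999", "0001-abc123"))
```

Because the resolver indirection is part of the scheme, the citation stays valid even if the underlying service moves.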

Learn more: Permissions System

After Archiving

Archiving marks the beginning of long-term data accessibility:

  • Ongoing editing: Researchers can edit and add data in DSP-APP at any time
  • Citation stability: Permanent identifiers (ARKs) ensure data remains citable
  • Discoverability: Metadata helps other researchers find the work
  • Access flexibility: Web interface or API access according to needs

DaSCH remains available for questions, technical support, and guidance as projects evolve.

Getting Started

Researchers can contact DaSCH at info@dasch.swiss to discuss data archiving needs.

