Getting Started

1. Is DaSCH suitable for my project?

We are a long-term archive for research data at PhD level and above, specialized in qualitative data from the humanities. Examples of qualitative data that we archive include data from archaeological excavations or critical editions. If you plan to use DaSCH to archive your data, please contact us in the planning phase of your project to see if we are the right partner for you. If DaSCH isn't suitable for your project, we will offer advice about alternatives. You can also gain an overview of different repositories through a registry service such as re3data.org, fairsharing.org or forschungsdaten.info.

2. Can you help me write a Data Management Plan (DMP)?

Many researchers find it difficult to write a Data Management Plan (DMP), and we recommend that you ask your university's research IT support for help.

If you want to archive your data with DaSCH, please contact us when writing your DMP to ensure that we are the right partner for your kind of data. After assessing your DMP, we will issue a letter confirming that we will archive your data for the long term.

You can find best practices on how to write your DMP in our manual here.

Contact us at: info@dasch.swiss

3. What is research data?

Humanities researchers often state that they do not produce research data. In this case, a simple thought experiment may help to understand the concept of research data: imagine that your computer is stolen the day before you submit your paper to the publisher, and the only back-up that you have is a Word document with the text of your article. All of the information that is now missing is your research data: Excel files, tables, photos, scans, personal notes, etc.

Data Types & Formats

4. What types of data can be archived with DaSCH?

We are specialized in qualitative data from the humanities. We archive structured data, e.g. represented as rows and columns in an Excel sheet or a database export, as well as files such as images, audio files, videos, text documents and ZIP files, as long as they are connected to the data points of the structured data.

As we are specialized in archiving rather than presenting data, we cannot host research tools or virtual research environments (VREs) with their functionality and GUI.

Our DSP archiving software allows you to structure your data according to your needs, annotate it and set links between data points. All data in DSP has to conform with a data model that you must create, establishing the inner structure and logic of your data by defining resources (e.g. "book", "author") and properties (e.g. a "book" has a year of publication, a certain number of pages, a cover image, etc.).
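To make the idea of a data model more concrete, here is a minimal sketch in Python. The class and property names ("Book", "written_by", etc.) are purely illustrative; in DSP, a data model is defined on the platform itself or in an import file, not in Python code.

    from dataclasses import dataclass
    from typing import Optional

    # Illustrative only: a conceptual "data model" with a resource class ("Book")
    # and its properties. In DSP, the model is defined on the platform, not in code.

    @dataclass
    class Author:
        name: str

    @dataclass
    class Book:
        title: str                           # property: every book has a title
        year_of_publication: int             # property: year of publication
        number_of_pages: int                 # property: number of pages
        cover_image: Optional[str] = None    # property: file name of a cover image
        written_by: Optional[Author] = None  # link to another resource ("author")

    # A single data point (resource) conforming to this model:
    book = Book(
        title="Il Principe",
        year_of_publication=1532,
        number_of_pages=140,
        cover_image="il_principe_cover.jpg",
        written_by=Author(name="Niccolò Machiavelli"),
    )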

5. What file formats can be archived with DaSCH?

We encourage you not to archive your dataset as a single Excel file, but instead to enter the individual data points as resources and attach properties to them according to your data model, so that your data is more easily searchable and reusable. The files listed below can be archived in their current state, while all other file types need to be converted first.

Supported file formats:

Documents & Archives:

  • DOC, DOCX (Word documents)
  • XLS, XLSX (Excel spreadsheets)
  • PPT, PPTX (PowerPoint presentations)
  • PDF (Portable Document Format)
  • ZIP, TAR, GZ, TAR.GZ, TGZ, GZIP, 7Z (Archive formats)

Images:

  • JPG, JPEG, JP2, PNG (Standard image formats)
  • TIF, TIFF (High-quality image formats)

Audio & Video:

  • MP3, WAV (Audio formats)
  • MP4 (Video format)

Data & Text:

  • XML, TXT, CSV, JSON (Structured data formats)

FAIR Principles & Access

6. My funding institution requires me to share my data according to FAIR principles. Is DaSCH the right partner for this?

Yes, we actively advocate for open research data and our philosophy has always been based on the FAIR principles of data being findable, accessible, interoperable and reusable.

Findable: DaSCH is registered with the major repository registries and offers a metadata browser.

Accessible: we follow an open data strategy, whereby all archived datasets are freely accessible by default.

Interoperable: we offer an open source API that implements linked open data standards such as RDF, RDFS, OWL and SHACL. Images are accessible according to the IIIF standard. All source code and documentation is freely accessible on GitHub.

Reusable: we enforce consistency checks and annotation, with every resource receiving a citable persistent ARK identifier.

7. Is my data stored with DaSCH publicly available and citable?

Yes, every project and all resources within it receive an archival resource key (ARK) that can be cited in academic publications, in the form of a URL that remains reachable even if you make changes to the original data. Everyone can see and access your data unless you restrict its visibility yourself. If there are parts of the data you do not want to be publicly accessible, e.g. because they are unpublished, we can offer an embargo of one to two years, after which everyone can access the data.

8. Can I archive data but restrict its access?

We offer several options for permissions to view and/or edit your data. For example, an image can be hidden, visible or even editable for users who are not logged in. Additionally, access to the data can be restricted for non-project staff.

We also offer an embargo on your entire data set for a maximum of two years, during which your data is only visible to project members. This feature is useful for projects that want to cite their data set but don't want it to be publicly available prior to publication.

9. What value does DaSCH add to my project?

A typical paper or monograph in the humanities contains only a small part of the data that was assembled during the project phase. All too often, the rest stays on a personal computer for some years before falling into oblivion. But this can be improved: if you decide to archive your entire dataset in a repository like DaSCH, this will

  • increase the visibility of your research.
  • make it easier for other researchers to pick up on your publication, so you will be cited more often.
  • boost the humanities as a whole: other researchers won't need to waste months collecting data that you have already collected.

Platform Features

10. Are DaSCH's solutions consistent with international standards for metadata?

DaSCH maintains its own metadata browser at meta.dasch.swiss. In order to foster interoperability and reusability, it is very important that DaSCH complies with international standards for metadata. The agreed-upon minimal requirements are mandatory fields in our own metadata schema, and we encourage researchers to also provide optional information.

Internal structure of the datasets

Every data set in the DaSCH Service Platform (DSP) is internally structured by the resource classes of the data model and their properties.

We encourage these classes to be derived from existing gazetteers, controlled vocabularies, ontologies or conceptual schemas such as Dublin Core, schema.org, PROV-O or the CIDOC CRM ontology.

Our database is built on linked open data standards such as RDF, RDFS, OWL and SHACL.

Multimedia files are accessible according to the IIIF standard.
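As an illustration of what IIIF access means in practice, the following sketch assembles an image request URL following the IIIF Image API pattern (region/size/rotation/quality.format). The host name and image identifier are hypothetical placeholders, not a real DaSCH resource.

    # Minimal sketch of a IIIF Image API request URL (hypothetical host and identifier).
    base_url = "https://iiif.example.org/0801"   # IIIF server and project prefix (placeholder)
    identifier = "example_image.jp2"             # image identifier (placeholder)

    # IIIF Image API parameters: region / size / rotation / quality.format
    region, size, rotation, quality_format = "full", "max", "0", "default.jpg"

    image_url = f"{base_url}/{identifier}/{region}/{size}/{rotation}/{quality_format}"
    print(image_url)
    # https://iiif.example.org/0801/example_image.jp2/full/max/0/default.jpg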

Description of the datasets

Every dataset published in DSP must have a comprehensive project description, namely metadata about the dataset. The minimum requirements in this respect consist of information such as a title, abstract, keywords, discipline(s), the principal investigator, funding institution, the time period covered, how the data was collected, etc.

For this kind of metadata, we have developed our own schema similar to schema.org.
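To give an impression of what such a project description covers, the sketch below collects the minimum fields mentioned above in a plain Python dictionary. The field names and values are illustrative examples, not the official DaSCH metadata schema.

    # Illustrative only: the kind of project metadata DaSCH expects.
    # Field names are examples, not the official schema.
    project_metadata = {
        "title": "Example Critical Edition Project",
        "abstract": "A short description of the dataset and its scope.",
        "keywords": ["critical edition", "manuscripts", "humanities"],
        "disciplines": ["Literary studies"],
        "principal_investigator": "Jane Doe",
        "funding_institution": "Swiss National Science Foundation (SNSF)",
        "temporal_coverage": "1500-1600",
        "data_collection": "Transcription and annotation of digitized manuscripts",
    }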

The metadata of ongoing and completed projects can be browsed at meta.dasch.swiss.

11. How can I display my research data on a website?

The data archived with DaSCH can be displayed in the following ways:

You can use our all-purpose DSP-APP frontend, which allows you to access your data via a user-friendly web page, although currently you cannot configure or customize this web page.

Our database has a public API (the DSP-API) that allows you to access the data archived with DaSCH in machine-readable form. This means that you can build and maintain a website with your own financial means and technical know-how, although DaSCH does not recommend this. The public API is kept as stable as possible but can change over time, and it would be your responsibility to keep up with announced changes and implement any necessary adjustments in your application.
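As a rough illustration of machine-readable access, the following sketch runs a full-text search against a DSP-API server using Python's requests library. The host name is a placeholder, and the exact endpoint path and response layout should be checked against the current DSP-API documentation; this is a sketch under those assumptions, not a guaranteed interface.

    import requests

    # Hypothetical DSP-API host; replace with the actual endpoint for your project.
    API_HOST = "https://api.example.dasch.swiss"

    # Full-text search (endpoint path assumed from the DSP-API docs; verify before use).
    response = requests.get(f"{API_HOST}/v2/search/machiavelli", timeout=30)
    response.raise_for_status()

    results = response.json()  # JSON-LD describing the matching resources
    for resource in results.get("@graph", []):
        print(resource.get("rdfs:label"))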

12. Can I edit my data after archiving?

Yes. In DSP-APP, you can manually view, delete or edit records, although these actions are restricted by the rules of the data model and the data that is already in the database. For example, if you have configured a field as mandatory, you cannot simply delete a value for that field in a record, but you may update the value.

13. How do I access and search datasets archived with DaSCH?

The datasets are listed and described at https://meta.dasch.swiss. If you have found an interesting dataset, follow the link to the data. You can then conduct a simple search via the search bar, or an advanced search that returns only the resources and properties that meet your search criteria.

The data is also accessible via the DSP-API, which allows technically skilled users to collect data in an automated way.

14. How can I reuse datasets from DaSCH for a new purpose?

Thanks to the advanced search functionality, you can filter existing datasets according to specific criteria and find exactly what you need. Searches are reproducible, and every resource has a stable identifier (ARK) that can be cited. All archived files can be downloaded.

Costs & Support

15. Are your services free of charge? How much data can I archive with DaSCH?

DaSCH is financed by the Swiss National Science Foundation (SNSF), which enables us to offer our services to Swiss research projects on very favorable terms. There are two types of costs that typically arise:

Data preparation: most research projects need help creating a data model, cleaning their data and resolving inconsistencies within their dataset. The first eight working hours are free of charge; afterwards, these services are charged at a rate of CHF 80 per hour + 7.7% VAT if you are a member of a Swiss Higher Education Institution, otherwise at CHF 160 per hour + 7.7% VAT. If you plan to submit a proposal for a research project to a funding institution, we advise you to apply for funding for preparing the data for long-term archiving.

Hosting costs: our DSP archiving software is optimized for small datasets of up to several hundred gigabytes. The DaSCH long-term archiving service is free of charge for data of up to 500 GB. If your dataset exceeds this size, you may be required to pay annual hosting costs. If you need further information, please contact us at info@dasch.swiss.

16. What support does DaSCH provide during my project?

We are happy to support you with the long-term archiving of your data at DaSCH, including the use of our tools and the preparation of your data for import into the DaSCH archiving system. We also offer training and workshops for our tools. If you plan to use DaSCH as an archive, you should consult us while writing your DMP. For other IT-related questions, please contact your institution's research IT support.

Technical Requirements

17. Do I need in-depth IT knowledge to use the DaSCH archiving software (DSP)?

This depends on your project. We have a user-friendly interface (DSP-APP) in which you can build a data model and manually add individual data points; this simply requires following some instructions and a bit of practice. However, if you have a large dataset that you want to archive at once (mass data import), you either need some IT knowledge to convert your data into our DSP-conforming XML format, or we can offer this conversion as a paid service.
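To illustrate what such a conversion can look like, here is a small sketch that turns rows of a CSV file into an XML structure using Python's standard library. The element and attribute names are placeholders and do not reproduce the exact DSP import schema; the format actually required for import is described in the DaSCH documentation.

    import csv
    import xml.etree.ElementTree as ET

    # Illustrative sketch: convert a CSV of books (columns "title" and "year")
    # into a simple XML file. Element and attribute names are placeholders,
    # not the actual DSP import schema.
    root = ET.Element("data")

    with open("books.csv", newline="", encoding="utf-8") as f:
        for i, row in enumerate(csv.DictReader(f)):
            resource = ET.SubElement(root, "resource", {"id": f"book_{i}", "type": "Book"})
            ET.SubElement(resource, "property", {"name": "hasTitle"}).text = row["title"]
            ET.SubElement(resource, "property", {"name": "hasYear"}).text = row["year"]

    ET.ElementTree(root).write("books_import.xml", encoding="utf-8", xml_declaration=True)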

In any case, you should take some time to acquaint yourself with core concepts such as data models, resources and properties. For a successful collaboration, this is even more important than IT knowledge, as it prevents misunderstandings.

18. Can I archive datasets containing sensitive data?

For legal reasons, we don't archive sensitive data. It is your responsibility to remove or anonymize sensitive content so that your data does not allow any conclusions to be drawn about living persons. Sensitive data includes information on health, privacy, ethnic origin, social welfare needs, religious, ideological or political views, as well as criminal punishments and measures.

19. Why can't I archive my dataset with just a few clicks?

DaSCH is a curated repository for research data from the arts and humanities. If you have just a few files to deposit, please contact us at info@dasch.swiss. After we have checked whether your data falls within the scope of our repository, we will create a project for you; you can then use a simple standard data model, adapt it to your needs if necessary, and proceed on your own.

If you want your data to be directly accessible and visible, we offer a platform that presents your data in a highly structured, richly annotated and consistent manner. In this case, it is necessary to create a custom data model that exactly defines the inner structure of your data, described by resource classes (e.g. "book", "author") and properties (e.g. a "book" has a year of publication, a certain number of pages, a cover image, etc.). Afterwards, you either add the data manually on our platform (e.g. 100 books, each with a year of publication, etc.), or your data has to be cleaned so that it complies with the data model and can be imported. While this approach involves a lot of work, it makes your data cleaner and more understandable for other users, and significantly increases its findability when searching.