To bring some fun into the application for users who enter data, we are thinking about adding game-like achievements. These could also be useful to motivate users to explore the application, give feedback and so on.
Here are some ideas, feel free to add your own.
Interesting, but it has to be updated manually for users in every version of OpenAtlas they are part of.
Tools for anthropological analyses to use directly while working on human remains, allowing the acquisition of basic data like age, sex and pathologies for use in an anthropological and archaeological context.
Regarding the model this is connected to burial as a stratigraphic unit (E18 physical object) and human remains (E20 biological object).
For some aspects, entering data is already possible, but a better user interface is desirable. For other data, new ways of entering are required, e.g. for sex and age estimation.
For the (graphical) anthropological interface, the following bones and bone parts have to be recorded (import IDs of bones already entered as types are given in brackets):
See: Bone inventory
OpenAtlas provides a REST-like API to easily access data entered in OpenAtlas.
A complete overview of possible endpoints and their usage is available in our Swagger documentation.
Overview of available endpoints: Endpoints
The API can be accessed via the following schema: {domain}/api/{api version}/{endpoint} for example: demo.openatlas.eu/api/0.3/entity/1234
If advanced layout is selected in your profile, a link to the different formats of entities is shown on their specific page.
If the option Public is activated in the site settings (default is off), the API can be used even when not logged in.
Overview of all parameters used in different versions: Parameters
Endpoints can take multiple parameters appended to the URL, e.g.: demo-openatlas.eu/api/code/actor?sort=desc&limit=300
The first parameter has to be prefixed with a ? to mark the beginning of the parameter section. Further parameters can be added with &.
api/code/actor?sort=desc&limit=300
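The URL schema and parameter joining described above can be sketched in Python. The helper name is ours and not part of OpenAtlas; it only illustrates how the `?`/`&` pattern composes:

```python
from urllib.parse import urlencode

def build_api_url(domain, version, endpoint, **params):
    """Build an OpenAtlas API URL: {domain}/api/{version}/{endpoint}?key=value&..."""
    url = f"https://{domain}/api/{version}/{endpoint}"
    if params:
        # The first parameter is introduced with '?'; urlencode joins
        # any further parameters with '&'.
        url += "?" + urlencode(params)
    return url

build_api_url("demo.openatlas.eu", "0.3", "entity/1234")
# -> "https://demo.openatlas.eu/api/0.3/entity/1234"
build_api_url("demo.openatlas.eu", "0.3", "code/actor", sort="desc", limit=300)
# -> "https://demo.openatlas.eu/api/0.3/code/actor?sort=desc&limit=300"
```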
Please visit CORS for more information.
general development
Versioning
The API can basically be accessed in two ways: either from the user interface of an OpenAtlas application or, if the settings allow it, from another application.
Please also refer to the SwaggerHub documentation: https://app.swaggerhub.com/apis-docs/ctot-nondef/OpenAtlas/0.2
These endpoints can provide full information about one or more entities. The output format is the Linked Places Format (LPF). Alternatively, a simple GeoJSON format and multiple RDF serializations derived from the LPF are available.
Retrieves a representation of an entity through its ID.
Retrieves a JSON with a list of entities based on their CIDOC CRM class code. The output contains a results and a pagination key. All codes available in OpenAtlas can be found under OpenAtlas and CIDOC CRM. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a JSON with a list of entities based on their OpenAtlas view name. Available categories can be found at OpenAtlas and CIDOC CRM. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a JSON with a list of entities based on their OpenAtlas system class. Available categories can be found at OpenAtlas and CIDOC CRM. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
With the query endpoint, one can combine the three endpoints above in a single query. Each request has to be a new parameter. Possible parameters are:
For more details of the different queries, please consult the associated section. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves the latest entries made in the OpenAtlas database. The number after /latest represents the number of entities retrieved and can be any value between and including 1 and 100.
Retrieves a list of entities which are linked to the entity with the given ID. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a list of entities based on their OpenAtlas type. A possible ID can be obtained, for example, through the type_tree or node_overview endpoint. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a list of entities based on their OpenAtlas type. This also includes all entities which are connected to a subtype. A possible ID can be obtained, for example, through the type_tree or node_overview endpoint. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a JSON list of entity names, IDs and URLs, based on their OpenAtlas type. Be aware that "Historical Place" and "Administrative Units" cannot be retrieved this way.
Retrieves a JSON list of entity names, IDs and URLs, based on their OpenAtlas type. This also includes all subtypes of the given type ID. Be aware that "Historical Place" and "Administrative Units" cannot be retrieved this way.
Retrieves a JSON list of entity names, IDs and URLs from the first layer of subunits of the given entity ID.
Retrieves a JSON list of entity names, IDs and URLs of all subunits of the given entity ID.
Retrieves a detailed JSON list of all OpenAtlas types. This also includes a list of the children of each type.
Retrieves a JSON list of all OpenAtlas types sorted by custom, places, standard and value.
Provides a list of all available system classes, their CIDOC CRM mapping, which view they belong to, which icon is used and their English name.
Retrieves a JSON of the content (Intro, Legal Notice, Contact and the size for processed images) of the OpenAtlas instance. The language can be chosen with the lang parameter (en or de).
Retrieves a list of all selected geometries in the database in a standard GeoJSON format. This endpoint should be used for map overviews.
Retrieves a list of how many entities each system class has.
Provides the image of the requested ID. Be aware that the image will only be displayed if:
Not all endpoints support all parameters. Also, some endpoints have additional unique parameter options, which are described in their respective sections.
path\parameter | type_id | format | page | sort | column | limit | filter | first | last | show | count | download | lang | geometry | image_size |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
entity | x | x | x | ||||||||||||
code | x | x | x | x | x | x | x | x | x | x | x | x | |||
system_class | x | x | x | x | x | x | x | x | x | x | x | x | |||
entities_linked_to_entity | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities_all | x | x | x | x | x | x | x | x | x | x | x | x | |||
class | x | x | x | x | x | x | x | x | x | x | x | x | |||
latest | x | x | x | x | x | x | x | x | x | x | x | x | |||
query | x | x | x | x | x | x | x | x | x | x | x | x | |||
node_entities | x | x | |||||||||||||
node_entities_all | x | x | |||||||||||||
subunit | x | x | |||||||||||||
subunit_hierarchy | x | x | |||||||||||||
type_tree | x | x | |||||||||||||
node_overview | x | x | |||||||||||||
geometric_entities | x | x | x | ||||||||||||
content | x | x | |||||||||||||
classes | |||||||||||||||
system_class_count | |||||||||||||||
display | x |
<'asc', 'desc'>
?sort=<'asc','desc'>
If multiple sort parameter are used, the first valid sort input will be used.
It does not matter if the words are uppercase or lowercase (i.e. DeSc or aSC), but the query only takes asc or desc as valid input. If no valid input is provided, the result is ordered ascending.
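The selection rule above (first valid input wins, case-insensitive, default ascending) can be sketched as follows; the function is only an illustration of the described behavior, not part of the API:

```python
def pick_sort(values):
    """Return the first valid direction from a list of ?sort= values.

    Matching is case-insensitive; anything other than 'asc'/'desc' is
    ignored, and the default is ascending order.
    """
    for value in values:
        if value.lower() in ("asc", "desc"):
            return value.lower()
    return "asc"  # default when no valid input is provided

pick_sort(["DeSc", "asc"])  # -> "desc" (first valid input wins)
pick_sort(["bogus"])        # -> "asc" (invalid input is ignored)
```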
<'id', 'class_code', 'name', 'description', 'created', 'modified', 'system_type', 'begin_from', 'begin_to', 'end_from', 'end_to'>
The column parameter declares which columns in the table are sorted with the sort parameter.
?column=<'id', 'class_code', 'name', 'description', 'created', 'modified', 'system_type', 'begin_from', 'begin_to', 'end_from', 'end_to'>
If multiple column parameters are used, a list is created in the order in which the parameters are given (i.e. ?column=name&column=description&column=id will order by name, description and id).
It does not matter if the words are uppercase or lowercase (i.e. Name, ID, DeScrIPtioN or Class_Code). If no valid input is provided, the results are ordered by name.
<number>
The limit parameter declares how many results will be returned.
?limit=<number>
If multiple limit parameters are used, the first valid limit input will be used. Limit only takes positive numbers.
<=, !=, <, <=, >, >=, LIKE, IN, AND, OR, AND NOT, OR NOT>
The filter parameter is used to specify which entries should be returned.
?filter=<XXX>
Please note that the filter values translate directly into SQL. For example:
?filter=and|name|like|Ach&filter=or|id|gt|5432
AND e.name LIKE %%Ach%% OR e.id > 5432
?filter=or|id|gt|150&filter=anot|id|ne|200 ?filter=and|name|like|Ach
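The pipe-separated pattern of the examples (logic|column|operator|value, with further filters joined by &) can be assembled programmatically. This helper is a hypothetical client-side sketch, not part of OpenAtlas:

```python
def build_filters(*filters):
    """Join (logic, column, operator, value) tuples into a ?filter=...&filter=... string."""
    parts = [
        f"filter={logic}|{column}|{op}|{value}"
        for logic, column, op, value in filters
    ]
    # The first parameter gets '?', the rest are chained with '&'.
    return "?" + "&".join(parts)

# Reproduces the example from the text:
build_filters(("and", "name", "like", "Ach"), ("or", "id", "gt", 5432))
# -> "?filter=and|name|like|Ach&filter=or|id|gt|5432"
```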
first=<id> OR last=<id> OR page=<int>
The page parameter will take any number as page number and provides the entities of this page.
The first parameter takes IDs and will show every entity after and including the named ID.
The last parameter takes IDs and will show every entity after the named ID.
?page=<int>
?first=<id>
?last=<id>
Page, first and last will only take numbers. First and last have to be valid IDs. The table will be sorted AND filtered before pagination takes place.
?page=4 ?last=220 ?first=219
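The difference between first (inclusive) and last (exclusive) can be illustrated on an already sorted and filtered ID list. This is a client-side sketch of the described behavior, not the server implementation:

```python
def paginate(ids, first=None, last=None):
    """Mimic the first/last pagination rules on a sorted ID list.

    first: every entity after AND INCLUDING the given ID.
    last:  every entity AFTER the given ID.
    The ID has to exist in the list (it must be a valid ID).
    """
    if first is not None:
        return ids[ids.index(first):]      # inclusive of the given ID
    if last is not None:
        return ids[ids.index(last) + 1:]   # exclusive: entities after the ID
    return ids

paginate([218, 219, 220, 221], first=219)  # -> [219, 220, 221]
paginate([218, 219, 220, 221], last=220)   # -> [221]
```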
<'when', 'types', 'relations', 'names', 'links', 'geometry', 'depictions', 'none'>
The show parameter takes the key values of a JSON. If no value is given, every key will be filled. If a value is given, only the committed keys will be shown. If the parameter contains none, no additional keys/values will be shown. This only works with the Linked Places Format.
?show=<'when', 'types', 'relations', 'names', 'links', 'geometry', 'depictions', 'none'>
For each value, a new parameter has to be set. The value will be matched against a list of keywords, so wrong input will be ignored.
?show=when ?show=types ?show=types&show=when ?show=none
lp, geojson, pretty-xml, n3, turtle, nt, xml
With the format parameter, the output format of an entity representation can be selected. lp stands for Linked Places Format, which is the standard selection. For information on other formats, please refer to API Output Formats.
?format=<lp, geojson, pretty-xml, n3, turtle, nt, xml>
Only the last format parameter counts as valid input. This parameter is not case-sensitive.
?format=lp ?format=geojson ?format=n3
<int>
The whole search query will be filtered by this Type ID. Multiple type_id parameters are valid and are connected with a logical OR connection.
?type_id=<id>
type_id only takes a valid type ID.
?type_id=<int>
<>
Returns a JSON with the total count of the included entities.
?count
Only count will trigger the function. Any value assigned to count is ignored.
?count
<>
Will trigger the download of the result of the request path.
?download
Only download will trigger the function. Anything assigned to download is discarded.
?download
<'en', 'de'>
Selects the language in which the content will be displayed.
?lang
The default value is None, which means the default language of the OpenAtlas instance is used.
?lang ?lang=en ?lang=DE
gisAll, gisPointAll, gisPointSupers, gisPointSubs, gisPointSibling, gisLineAll, gisPolygonAll
Filters which geometric entities are retrieved through /geometric_entities. Multiple geometry parameters are valid. Be aware that this parameter is case-sensitive!
?geometry
The default value is gisAll. Be aware, this parameter is case-sensitive!
?geometry=gisPointSupers ?geometry=gisPolygonAll
The API can basically be accessed in two ways: either from the user interface of an OpenAtlas application or, if the settings allow it, from another application.
These endpoints can provide full information about one or more entities. The output format is the Linked Places Format (LPF). Alternatively, a simple GeoJSON format and multiple RDF serializations derived from the LPF are available.
Retrieves a representation of an entity through its ID.
Retrieves a JSON with a list of entities based on their CIDOC CRM class code. The output contains a results and a pagination key. All codes available in OpenAtlas can be found under OpenAtlas and CIDOC CRM. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a json with a list of entities based on their OpenAtlas view name. Available categories can be found at OpenAtlas and CIDOC CRM. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a json with a list of entities based on their OpenAtlas system class. Available categories can be found at OpenAtlas and CIDOC CRM. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
With the query endpoint, one can combine the three endpoints above in a single query. Each request has to be a new parameter. Possible parameters are:
For more details of the different queries, please consult the associated section. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically, and 20 entities are shown.
Retrieves the latest entries made in the OpenAtlas database. The number after /latest represents the number of entities retrieved and can be any value between and including 1 and 100.
Retrieves a list of entities, which are linked to the entity with the given id. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a list of entities based on their OpenAtlas type. A possible id can be obtained, for example, by the type_tree or node_overview endpoint. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a list of entities based on their OpenAtlas type. This also includes all entities which are connected to a subtype. A possible ID can be obtained, for example, through the type_tree or node_overview endpoint. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a detailed JSON list of all OpenAtlas types. This also includes a list of the children of each type.
Retrieves a JSON list of all OpenAtlas types sorted by custom, places, standard and value.
Takes only a valid place (E18) ID. Retrieves a list of the given place and all of its subunits. This endpoint provides a special Thanados format. With the format=xml parameter, an XML representation can be created.
Provides a list of all available system classes, their CIDOC CRM mapping, which view they belong to, which icon is used and their English name.
Retrieves a json of the content (Intro, Legal Notice, Contact and the size for processed images) from the OpenAtlas instance. The language can be chosen with the lang parameter (en or de).
Retrieves a list of all selected geometries in the database in a standard GeoJSON format. This endpoint should be used for map overviews.
Retrieves a list of how many entities each system class has.
Provides the image of the requested ID. Be aware, the image will only be displayed if:
Not all endpoints support all parameters. Also, some endpoints have additional unique parameter options, which are described in their respective sections.
path\parameter | type_id | format | page | sort | column | limit | search | first | last | show | count | download | lang | geometry | image_size | export |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
entity | x | x | x | x | ||||||||||||
code | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
system_class | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
entities_linked_to_entity | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities_all | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
class | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
latest | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
query | x | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities | x | x | x | |||||||||||||
type_entities_all | x | x | x | |||||||||||||
subunit | x | x | ||||||||||||||
subunit_hierarchy | x | x | ||||||||||||||
type_tree | x | x | ||||||||||||||
node_overview | x | x | ||||||||||||||
geometric_entities | x | x | x | |||||||||||||
content | x | x | ||||||||||||||
classes | ||||||||||||||||
system_class_count | ||||||||||||||||
display | x |
<'asc', 'desc'>
?sort=<'asc','desc'>
If multiple sort parameter are used, the first valid sort input will be used.
It does not matter if the words are uppercase or lowercase (i.e. DeSc or aSC), but the query only takes asc or desc as valid input. If no valid input is provided, the result is ordered ascending.
<'id', 'name', 'cidoc_class', 'system_class'>
The column parameter declares which columns in the table are sorted with the sort parameter.
?column=<'id', 'name', 'cidoc_class', 'system_class'>
If multiple column parameters are used, a list is created in the order in which the parameters are given (i.e. ?column=name&column=id will order by name and then by id).
It does not matter if the words are uppercase or lowercase (i.e. Name, ID or SyStem_ClaSs). If no valid input is provided, the results are ordered by name.
<number>
The limit parameter declares how many results will be returned. limit=0 will return all entities.
?limit=<number>
If multiple limit parameters are used, the first valid limit input will be used. Limit only takes positive numbers.
The search parameter provides a tool to filter and search the data with logical operators.
Search parameter:
?search={}
Logical operators:
These are not mandatory; or is the default value.
and, or
Compare operators:
equal, notEqual, like (1), greaterThan (2), greaterThanEqual (2), lesserThan (2), lesserThanEqual (2)
(1) Only string categories
(2) Only beginFrom, beginTo, endFrom, endTo, valueTypeID
Filterable categories:
entityName, entityDescription, entityAliases, entityCidocClass, entitySystemClass, entityID, typeID, valueTypeID, typeIDWithSubs, typeName, beginFrom, beginTo, endFrom, endTo, relationToID
The search parameter takes a JSON as its value. A key has to be a filterable category, followed by a list/array. This list needs to contain JSON objects as items. There can be multiple search parameters. E.g.:
?search={"typeID":[{"operator":"equal","values":[123456]}], "typeName":[{"operator":"like","values":["Chain", "Bracelet", "Amule"],"logicalOperator":"and"}]}&search={"typeName":[{"operator":"equal","values":["Gold"]}], "beginFrom":[{"operator":"lesserThan","values":["0850-05-12"],"logicalOperator":"and"}]}
Every JSON in a search parameter field is logically connected with AND. E.g.:
?search={A:[{X}, {Y}], B: [M]} => Entities containing A(X and Y) and B(M)
Each search parameter is logically connected with OR. E.g.:
?search={A:[{X}, {Y}]}&search={A:[{M}]} => Entities containing A(X and Y) or A(M)
Within the list of a key, multiple queries are possible. A query contains a compare operator, the values to be searched for and a logical operator defining how the values should be handled. E.g.:
{"operator":"equal","values":[123456],"logicalOperator":"or"} {"operator":"notEqual","values":["string", "otherString"],"logicalOperator":"and"} {"operator":"lesserThan","values":["0850-05-12"],"logicalOperator":"and"} {"operator":"like","values":["Gol", "Amul"],"logicalOperator":"and"}
Values have to be a list of items. The items can be either a string, an integer or a tuple (see note). Strings need to be marked with "" or '', while integers do not allow this.
Note: the category valueTypeID can search for values of a type ID, but it takes one or more two-valued tuples as list entries: (x,y), where x is the type ID and y is the searched value. This can be an int or a float. E.g.:
{"operator":"lesserThan","values":[(3142,543.3)],"logicalOperator":"and"}
The compare operators work like the mathematical operators: equal x=y, notEqual x!=y, greaterThan x>y, greaterThanEqual x>=y, lesserThan x<y, lesserThanEqual x<=y. The like operator searches for occurrences of the string, so a match can also occur in the middle of a word.
With the following example, we can textualize the outcome:
?search={"typeID":[{"operator":"equal","values":[123456],"logicalOperator":"or"}, "typeName":[{"operator":"notEqual","values":["Chain", "Burial object"],"logicalOperator":"and"]}&search={"typeName":[{"operator":"like","values":["Gol"],"logicalOperator":"or"}]}
Get entities which have the typeID 123456 AND NOT the types called "Chain" and "Burial object", OR all entities whose type name contains "Gol" (e.g. "Gold").
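Since hand-writing nested search strings is error-prone, a client can build the criteria as a dictionary and serialize it. This sketch reproduces the first search parameter of the example above; the URL-encoding step is our addition (most HTTP clients perform it automatically):

```python
import json
from urllib.parse import quote

# Criteria from the example above: type ID 123456 AND NOT the types
# called "Chain" and "Burial object".
criteria = {
    "typeID": [{"operator": "equal", "values": [123456],
                "logicalOperator": "or"}],
    "typeName": [{"operator": "notEqual",
                  "values": ["Chain", "Burial object"],
                  "logicalOperator": "and"}],
}
# Serialize the JSON and URL-encode it before appending it as ?search=...
query = "?search=" + quote(json.dumps(criteria))
```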
first=<id> OR last=<id> OR page=<int>
The page parameter will take any number as page number and provides the entities of this page.
The first parameter takes IDs and will show every entity after and including the named ID.
The last parameter takes IDs and will show every entity after the named ID.
?page=<int>
?first=<id>
?last=<id>
Page, first and last will only take numbers. First and last have to be valid IDs. The table will be sorted AND filtered before pagination takes place.
?page=4 ?last=220 ?first=219
<'when', 'types', 'relations', 'names', 'links', 'geometry', 'depictions', 'none'>
The show parameter takes the key values of a JSON. If no value is given, every key will be filled. If a value is given, only the committed keys will be shown. If the parameter contains none, no additional keys/values will be shown. This only works with the Linked Places Format.
?show=<'when', 'types', 'relations', 'names', 'links', 'geometry', 'depictions', 'none'>
For each value, a new parameter has to be set. The value will be matched against a list of keywords, so wrong input will be ignored.
?show=when ?show=types ?show=types&show=when ?show=none
<'P2', 'P67', 'P53', 'OA7', ...>
The relation_type parameter takes CIDOC CRM property codes (P) and OpenAtlas codes (OA) as values. The relations JSON field of the Linked Places Format will then only show relations with the given codes. This can significantly decrease the payload.
For each value, a new parameter has to be set. The value will be matched against the list of possible property codes. Wrong input will be ignored.
?relation_type=P2 ?relation_type=P67 ?relation_type=P2&relation_type=OA7
lp, geojson, pretty-xml, n3, turtle, nt, xml
With the format parameter, the output format of an entity representation can be selected. lp stands for Linked Places Format, which is the standard selection. For information on other formats, please refer to API Output Formats.
?format=<lp, geojson, pretty-xml, n3, turtle, nt, xml>
Only the last format parameter counts as valid input. This parameter is not case-sensitive.
?format=lp ?format=geojson ?format=n3
<int>
The whole search query will be filtered by this Type ID. Multiple type_id parameters are valid and are connected with a logical OR connection.
?type_id=<id>
type_id only takes a valid type ID.
?type_id=<int>
<>
Returns a JSON with the total count of the included entities.
?count
Only count will trigger the function. Any value assigned to count is ignored.
?count
<>
Will trigger the download of the result of the request path.
?download
Only download will trigger the function. Anything assigned to download is discarded.
?download
<'en', 'de'>
Selects the language in which the content will be displayed.
?lang
The default value is None, which means the default language of the OpenAtlas instance is used.
?lang ?lang=en ?lang=DE
gisAll, gisPointAll, gisPointSupers, gisPointSubs, gisPointSibling, gisLineAll, gisPolygonAll
Filters which geometric entities are retrieved through /geometric_entities. Multiple geometry parameters are valid. Be aware that this parameter is case-sensitive!
?geometry
The default value is gisAll. Be aware, this parameter is case-sensitive!
?geometry=gisPointSupers ?geometry=gisPolygonAll
csv, csvNetwork
Exports the result in the given format as a download. csv is a single CSV file of the result. csvNetwork produces multiple CSV files, intended especially for network analysis: one CSV per system class describing the entities, a link.csv showing the links between the entities and a geometry.csv containing the geometries.
?export
?export=csv ?export=csvNetwork
/api/0.3/query will be deleted; the idea is that all endpoints can be stacked behind the entities resource with additional parameters, e.g.:
/api/entities/cidoc_code/E18?sort=desc/type/23/entity/512
If this is not possible in any form, query will be brought back.
This page is a discussion base and documentation about the authentication system of the API.
To be filled... (What are token-based authentication and CORS, advantages and disadvantages, usage, Flask compatibility?)
This information is currently deprecated but planned to be updated.
For the API we want to create a more detailed error model. The error message should be machine- and human-readable; therefore we use the JSON format as response language.
One example could be:
{ title: "Forbidden", status: 403, detail: "You don't have the permission to access the requested resource. Please authenticate with the server, either through login via the user interface or token based authentication.", timestamp: "Fri, 29 May 2020 10:13:22 GMT", }
As an RFC draft for API error messages states (https://tools.ietf.org/html/draft-nottingham-http-problem-07):
The following error codes will be caught by the API error handler:
Code | Description | Detail | Error Message |
---|---|---|---|
400 | Bad Request | Client sent an invalid request — such as lacking required request body or parameter | The request is invalid. The body or parameters are wrong. |
401 | Unauthorized | Client failed to authenticate with the server | You failed to authenticate with the server. |
403 | Forbidden | Client authenticated but does not have permission to access the requested resource | You don't have the permission to access the requested resource. Please authenticate with the server, either through login via the user interface or token based authentication. |
404 | Not Found | The requested resource does not exist | Something went wrong! Maybe only digits are allowed. Please check the URL. |
404a | Not Found | The requested resource does not exist | The requested entity doesn't exist. Try another ID. |
404b | Not Found | The requested resource does not exist | The syntax is incorrect. Only digits are allowed. For further usage, please confer the help page. |
404c | Not Found | The requested resource does not exist | The syntax is incorrect. Valid codes are: actor, event, place, source, reference and object. For further usage, please confer the help page. |
404d | Not Found | The requested resource does not exist | The syntax is incorrect. This class code is not supported. For the classes, please refer to the model. |
404e | Not Found | The requested resource does not exist | The syntax is incorrect. Only integers between 1 and 100 are allowed. |
404f | Not Found | The requested resource does not exist | The syntax is incorrect. Only valid operators are allowed. |
405 | Invalid Method | The method is not available | The method used is not supported. Right now only GET is allowed. |
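On the client side, such an error body can be turned into a readable message. A minimal sketch assuming the JSON structure of the example above; the function name is ours:

```python
def describe_error(error):
    """Turn an API error body (title, status, detail, timestamp) into one log line."""
    return f"{error['status']} {error['title']}: {error['detail']}"

describe_error({
    "title": "Forbidden",
    "status": 403,
    "detail": "You don't have the permission to access the requested resource.",
    "timestamp": "Fri, 29 May 2020 10:13:22 GMT",
})
# -> "403 Forbidden: You don't have the permission to access the requested resource."
```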
Participants: Bernhard Koschicek, Alexander Watzinger, Nina Brundke, Christoph Hoffmann, Stefan Eichert and special guest Smilla
Location: ACDH-CH, Alte Burse, Sonnenfelsgasse 19, 1010 Vienna
Time: 2020-08-09, 14:00
After we discussed the excellently presented and extensive recent developments, we decided to switch to a more practical approach. Christoph and Stefan will write issues for the API to satisfy their needs for presentation sites, so that we can see and test the API in action.
The ARCHE import is deactivated by default. To enable it, add the following dictionary to instance/production.py:
ARCHE = {
    'id': 1390136,  # ID of the Top Collection (acdh:TopCollection)
    'collection_ids': [1390141],  # IDs of collections containing metadata.json files (acdh:Collection)
    'base_url': 'https://arche-curation.acdh-dev.oeaw.ac.at/',  # Base URL to get data from
    'thumbnail_url': 'https://arche-thumbnails.acdh.oeaw.ac.at/'  # URL of ARCHE thumbnail service, no changes needed
}
If the feature is enabled, every user can see an additional button called ARCHE in Admin -> Data, which leads to an information page about the data provided in instance/production.py.
If the user belongs to the manager user group, a button called Fetch is displayed. Pressing Fetch will fetch data from ARCHE and check whether the data has already been imported into OpenAtlas (based on the artifact). If an entry is not present, a summary table of the graffiti that will be imported is displayed.
By pressing the button Import ARCHE data, the data will be imported and, if necessary, new types, persons etc. will be created.
All data is gathered from [IMAGE_NAME]_metadata.json:
'image_id': image_id (ARCHE)
'image_link': image_url (ARCHE)
'image_link_thumbnail': thumbnail_url (ARCHE)
'creator': EXIF:Artist
'latitude': EXIF:GPSLatitude
'longitude': EXIF:GPSLongitude
'description': XMP:Description
'name': IPTC:ObjectName
'license': EXIF:Copyright
'date': EXIF:CreateDate
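The mapping above can be read as a simple extraction step. This is a hypothetical sketch assuming the metadata file has already been parsed into a dictionary; the ARCHE-side values (image_id, image_link, image_link_thumbnail) come from the ARCHE response itself and are omitted here:

```python
# Maps OpenAtlas field names to keys in a parsed [IMAGE_NAME]_metadata.json;
# key names follow the mapping listed in the text.
FIELD_MAP = {
    'creator': 'EXIF:Artist',
    'latitude': 'EXIF:GPSLatitude',
    'longitude': 'EXIF:GPSLongitude',
    'description': 'XMP:Description',
    'name': 'IPTC:ObjectName',
    'license': 'EXIF:Copyright',
    'date': 'EXIF:CreateDate',
}

def extract_fields(metadata):
    """Return only the mapped fields; missing keys become None."""
    return {field: metadata.get(source) for field, source in FIELD_MAP.items()}
```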
Data provided from production.py:
ARCHE = { 'id': 1390136, 'collection_ids': [1390141], 'base_url': 'https://arche-curation.acdh-dev.oeaw.ac.at/', 'thumbnail_url': 'https://arche-thumbnails.acdh.oeaw.ac.at/'}
Documents archived for historical reasons.
Linked with: P117 occurs during
Domain: E5, E6, E8, E12
Range: E5, E6, E8, E12
Linked with: P107 has current or former member
Domain: E74, E40
Range: E21, E74, E40
Entries with * are obligatory and cannot be altered. Others are facultative examples and are editable/extendable.
Types E55 are linked with E55 via P127 (has broader term)
Definitions of an actor's function within a group or legal body. An actor can, for example, be a member of a legal body, and this membership is defined by a certain function/role during a certain period of time. E.g. the actor "Charlemagne" is a member of the legal body "Frankish Reign" from 768-814 in the function of "King", and he is a member of the legal body "Roman Empire" from 800 to 814 in the function of "Emperor".
Bishop Abbot Pope King Emperor Count Duke
Types for Sources
Charter Testament Letter Contract
Definitions for the type of bibliographic items like "articles", "books", "proceedings" etc.
Article Book Inbook Mastersthesis Phdthesis unpublished
Definitions for the type of Source Editions like "Charter Edition" etc.
Charter Edition Letter Edition Chronicle Edition ...
Source Content* Source Original Text* Source Translation* Comment*
Categories for the types of information carriers. E.g. to determine if the physical manifestation of a medieval charter is an original, a later copy etc.
Original Document Copy of Document
Definitions of relationships between one actor and another. These relationships can be directional (e.g. parent of/child of: actor A is the father of actor B while actor B is the son of actor A) or the same in both directions (e.g. friend of: actor A is a friend of actor B and vice versa actor B is a friend of actor A).
Kindredship: Parent of (Child of)
Social: Friend of, Enemy of, Mentor of (Student of)
Political: Ally of, Leader of (Retinue of)
Economical: Provider of (Customer of)
Exact date value, From date value, To date value
Exact position, Near to a known position, Within a known area
Categories for persons' gender
Female Male
Types for Places
Boundary Mark, Burial Site, Economic Site, Infrastructure, Military Facility, Ritual Site, Settlement, Topographical Entity
Types of events
Change of Property: Donation, Sale, Exchange
Conflict: Battle, Raid
Categories to define the involvement of an actor within an event. E.g. "Napoleon" participated in the event "Invasion of Russia" as "Commander"
Creator Sponsor Victim Offender
E53 linked via P89 "falls within/contains" to E53 - e.g. Austria contains Wien/Wien falls within Austria
Administrative Units: Austria (Wien, Kärnten, Niederösterreich, Oberösterreich, Salzburg, Tirol, Steiermark, Vorarlberg, Burgenland), Germany, Italy, Czech Republic, Slovakia, Slovenia
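The P89 "falls within/contains" hierarchy above can be sketched as a simple parent lookup. This is an illustration only; the dict structure is hypothetical, not the actual database layout.

```python
# Minimal sketch of P89 "falls within/contains" links between E53 places;
# the dict below is an illustrative example, not the database layout.
falls_within = {
    'Wien': 'Austria',
    'Kärnten': 'Austria',
    'Salzburg': 'Austria'}

def contains(outer: str, unit: str) -> bool:
    """True if `unit` falls within `outer`, following P89 links upwards."""
    parent = falls_within.get(unit)
    while parent is not None:
        if parent == outer:
            return True
        parent = falls_within.get(parent)
    return False

print(contains('Austria', 'Wien'))  # True
```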
E55 entities (but listed here as they belong to places)
Used to determine the type of place, e.g. country, province, district... National terms are used due to each country's different administrative systems.
Country Bundesland Bezirk Gemeinde Katastralgemeinde Regierungsbezirk Landkreis Gemarkung Regioni Comune Province
E53 entities. These Places are for example historical regions like the "Medieval Kingdom of Serbia" or the early medieval "Duchy of Bavaria".
Historical Places: Carantania, Marcha Orientalis, Comitatus Iauntal, Kingdom of Serbia
Numbers in brackets are type ids from the THANADOS project.
An IIIF Image API server is responsible for delivering the images.
Since OpenAtlas relies on Debian packages, we recommend using IIPImage as the IIIF Image API server. However, any other working IIIF Image API server can be used, as long as it can handle tiled multi-resolution TIFF and serves the images from a folder.
For installation of the IIPImage server see install notes of OpenAtlas.
java -version
-bash: java: command not found
sudo apt install default-jre
cd /var/www
wget https://github.com/cantaloupe-project/cantaloupe/releases/download/v5.0.5/cantaloupe-5.0.5.zip
7z x cantaloupe-5.0.5.zip
mv cantaloupe-5.0.5 cantaloupe
cd cantaloupe
cp cantaloupe.properties.sample cantaloupe.properties
vim cantaloupe.properties
FilesystemSource.BasicLookupStrategy.path_prefix = /var/www/iiif/
# Enables the Control Panel, at /admin.
endpoint.admin.enabled = true
endpoint.admin.username = admin
endpoint.admin.secret = password
sudo a2enmod headers
sudo a2enmod proxy_http
sudo vim /etc/apache2/sites-available/cantaloupe.conf
<VirtualHost *:80>
# X-Forwarded-Host will be set automatically by the web server.
RequestHeader set X-Forwarded-Proto "https"
RequestHeader set X-Forwarded-Port "80"
RequestHeader set X-Forwarded-Path /
ServerName apache-server
AllowEncodedSlashes NoDecode
ErrorLog /var/log/apache2/cantaloupe_error.log
CustomLog /var/log/apache2/cantaloupe_access.log combined
ProxyPass / http://YOUR-DOMAIN.at:8182/ nocanon
ProxyPassReverse / http://YOUR-DOMAIN.at:8182/
ProxyPassReverseCookieDomain YOUR-DOMAIN.at apache-server
ProxyPreserveHost on
</VirtualHost>
sudo a2ensite cantaloupe.conf
sudo service apache2 restart
sudo mkdir /etc/cantaloupe/
openssl pkcs12 -export -out /etc/cantaloupe/ssl-certificate.pfx -inkey /etc/letsencrypt/live/YOUR-DOMAIN.at/privkey.pem -in /etc/letsencrypt/live/YOUR-DOMAIN.at/cert.pem -certfile /etc/letsencrypt/live/YOUR-DOMAIN.at/fullchain.pem
sudo chown bkoschicek:www-data /etc/cantaloupe/ssl-certificate.pfx
# !! Configures the HTTPS server. (Standalone mode only.)
https.enabled = true
https.host = 0.0.0.0
https.port = 8183
# !! Available values are `JKS` and `PKCS12`. (Standalone mode only.)
https.key_store_type = PKCS12
https.key_store_password = PASSWORD
https.key_store_path = /etc/cantaloupe/ssl-certificate.pfx
https.key_password = PASSWORD
sudo vim /etc/systemd/system/cantaloupe.service
[Unit]
Description=Cantaloupe Image Server
After=network.target

[Service]
ExecStart=/usr/bin/java -Dcantaloupe.config=/var/www/cantaloupe/cantaloupe.properties -Xmx2g -jar /var/www/cantaloupe/cantaloupe-5.0.5.jar
Restart=on-failure
User=root
WorkingDirectory=/var/www/cantaloupe/

[Install]
WantedBy=multi-user.target
sudo systemctl daemon-reload
sudo systemctl enable cantaloupe.service
sudo systemctl start cantaloupe.service
Since the IIPImage server is not easy to install on Windows, and we don't really need it for development, I suggest working with Cantaloupe to get an on-the-fly IIIF server.
scoop install main/libvips
FilesystemSource.BasicLookupStrategy.path_prefix = C:\Users\bkoschicek\PycharmProjects\iiif\
java -Dcantaloupe.config=C:/cantaloupe-5.0.5/cantaloupe.properties -Xmx2g -jar cantaloupe-5.0.5.jar
http://localhost:8182/iiif/2/image.jpg/info.json
http://localhost:8182/iiif/2/image.jpg/full/full/0/default.jpg
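The two URLs above follow the IIIF Image API 2.x pattern `{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{format}`. As a small sketch (the helper names are illustrative, the base URL matches the local Cantaloupe example above):

```python
# Sketch: assembling IIIF Image API 2.x request URLs; helper names are
# illustrative, the base URL matches the local Cantaloupe setup above.
BASE = 'http://localhost:8182/iiif/2'

def iiif_info(identifier: str) -> str:
    """URL of the image information document."""
    return f'{BASE}/{identifier}/info.json'

def iiif_image(identifier: str, region: str = 'full', size: str = 'full',
               rotation: int = 0, quality: str = 'default',
               fmt: str = 'jpg') -> str:
    """URL of an image request, full image by default."""
    return f'{BASE}/{identifier}/{region}/{size}/{rotation}/{quality}.{fmt}'

print(iiif_image('image.jpg'))
# http://localhost:8182/iiif/2/image.jpg/full/full/0/default.jpg
```

Changing `region` or `size` (e.g. `size='!200,200'`) yields the cropped or scaled variants the server renders on the fly.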
OpenAtlas uses several shortcuts in order to simplify connections between entities that are always used the same way.
These shortcuts are named OA plus the respective number. Currently OpenAtlas uses three shortcuts:
E39 (Actor) - P11i (participated in) - E5 (Event) - P11 (had participant) - E39 (Actor)
The connecting event is defined by an entity of class E55 (Type):
[Relationship from Stefan to Joachim (E5)] has type [Son to Father (E55)]
E77 (Persistent Item) - P92i (was brought into existence by) - E63 (Beginning of Existence) - P7 (took place at) - E53 (Place)
E77 (Persistent Item) - P93i (was taken out of existence by) - E64 (End of Existence) - P7 (took place at) - E53 (Place)
In the following section some mapping examples are discussed. Please be aware that they may differ from the current development version of OpenAtlas.
In this section various data mappings, as used in OpenAtlas, are presented. They use classes and properties from the CIDOC CRM (http://www.cidoc-crm.org/) to map the information. If custom properties or shortcuts are used, they are named "OA" plus a number and are described in detail in the Custom Properties and Shortcuts section. Within OpenAtlas it is also possible to specify various additional attributes of a property, e.g. the timespan in which the property links two entities or the role an actor has within a certain group.
Every entity recorded in OpenAtlas can have a main name or primary identifier, stored via P1 ("is identified by" or a sub-property) from the entity to E41 ("Appellation" or a sub-class). This name or identifier can consist of one or more words and can contain - e.g. in the case of an actor - a first name, second name and surname as well as additional information like titles: "Sir Elton Hercules John".
If an entity has alternative names like stage names or pseudonyms, these are also documented using a P1 (is identified by) link to E41 (Appellation): "Reginald Kenneth Dwight" (= birth name of Elton John). Another example would be "Charlemagne", "Carolus Magnus" and "Karl der Große", which are recorded with the described properties.
Alternative forms of names are mapped with P139: Two E42 identifier entities linked via P139 (has alternative form).
Written sources like for example medieval charters are mapped in OpenAtlas as a combination of several entities. The core is the content of the source that is defined as E33 (linguistic object).
This core usually contains a summary or the whole text of the source in a language understandable for the current users of the database. This core can be linked to translations respectively to the text in the original language (e.g. Latin) that is also stored as E33 (linguistic object).
The core can also be linked to a physical object (E84 Information Carrier) like the original charter (e.g. a parchment manuscript) that carries the information described in the core.
The core content can of course also be documented in other documents (E31), for example various editions of charters or secondary sources.
This core of the source, respectively the source's content delivers information on various other things like the events, persons and physical things mentioned in the text. These relations are recorded with a P70b (is documented in) link from the mentioned entity to the source.
Information on the environment in which the source originally was generated or on the current or former owner is connected to the physical object/information carrier. This includes for example the legal body or the actor for whom a charter originally was signed or the archive where it is or was stored. Also time and place of creation, as well as the creator and the context of the creation can be stored.
The actor for whom the document was originally set up is recorded by a link P105 (right held by) from the information carrier (E84) to the actor.
The current or former owner of the document, e.g. the archive where a charter is currently kept, is recorded by a link P46 (has former or current owner) from the information carrier (E84) to the actor.
The production of the charter is recorded by an E12 (Production) event that is linked to the charter (E84 information carrier) with a P108 (was produced by) link.
This E12 Production can be linked to places (P7 took place at) and to a certain point in time (OA5 and OA6 begins/ends at).
It can also be linked to other temporal entities, like a superior event during which the charter was produced (e.g. a Hoftag/Assembly) or to a certain chronological period/timespan via a P117 (occurs during) link.
The creator/writer (E39 Actor) of the charter is linked to the production event via a P104 (performs) property.
Within OpenAtlas, single persons (E21), groups (E74) - like families - and legal bodies (E40) - for example the Holy Roman Empire - can be managed.
They can be linked to other entities like actors, events, documents, physical things, places, date and time etc.
Actors can participate in events either actively or passively. The first case is mapped with P14 performed, the latter with P11 participated in. In case of changes of property (i.e. if the event is an E8 Acquisition), P23 surrendered title through - for the giver - and P22 acquired title through - for the receiver - are used.
Also the actor's role can be documented by linking the property with an E55 type entity, for example to map one actor's role as sponsor and another's role as artist/creator during the creation of a physical man-made thing.
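The property choice described above can be sketched as a small decision function. This is purely illustrative; the function name and parameters are hypothetical, not OpenAtlas code.

```python
# Sketch of the actor-event property choice described above;
# function and parameter names are illustrative only.
def participation_property(active: bool, acquisition_role: str = '') -> str:
    """Pick the CIDOC CRM property linking an actor to an event.

    acquisition_role: '' for ordinary events, 'giver' or 'receiver'
    when the event is an E8 Acquisition.
    """
    if acquisition_role == 'giver':
        return 'P23 surrendered title through'
    if acquisition_role == 'receiver':
        return 'P22 acquired title through'
    return 'P14 performed' if active else 'P11 participated in'

print(participation_property(active=False))  # P11 participated in
```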
Actors (like any other entities) can be linked to E73 information objects such as E31 documents, e.g. an article on a historical person, or E33 linguistic objects, like the text of a medieval charter in which a person is mentioned.
Actors can be part of or have a certain role within an E74 group or an E40 legal body. The specification of this "membership" is recorded with a link to an E55 type.
Actors can have direct relationships to other actors. Such relations are mapped with OA7 has relationship to. The type of relationship is specified with a link to E55 type.
Such relationships can be the same in both directions (E.g. Person A is friend of Person B and at the same time Person B is friend of Person A) or have an opposite meaning in the opposite direction (E.g. Person A is the father of person B while Person B is the son of Person A).
Actors can have direct relationships to physical things. In most cases this concerns property, meaning that for example a person is the owner of a thing, e.g. a manor. Such relations are mapped with P51i is former or current owner of.
Actors can have various direct relations to places. Besides these, they may participate in an event that takes place at a certain location (see: Actors and Events).
Actors can have a certain preferred place like a residence, a headquarters etc. E.g. Salzburg (E53 place) is the headquarters of the Salzburg Bishopric (E40 legal body). Such relations are mapped with P74 has current or former residence.
Actors can have a certain place in which they are born (or appear for the first time) as well as a place of death (or a place where they appear for the last time). Such relations are mapped with OA8 appears for the first time in and OA9 appears for the last time in.
Actors can have a date at which they were born (or appear for the first time) or died (respectively appear for the last time). Such relations are mapped with OA1 begins chronologically and OA2 ends chronologically (to mark the timespan or date in/on which they appear for the first or last time) or OA3 born_chronologically and OA4 dies_chronologically (to mark a date on which they were born or died - if known). They link an actor with a time primitive like a timestamp.
In case this date is not known exactly, two time primitives can be recorded to mark a temporal span in which the birth or first appearance, respectively the death or last appearance, took place. The first timestamp is therefore connected (P2 has type) with a type (E55) "from value", the second with a "to value" type (= subtypes of "Numeric Value Types"). If one exact date is known, it gets the type "exact value".
These shortcuts are used in OpenAtlas to link various entities for certain purposes:
OA1 is used to link the beginning of a persistent item's (E77) life span (or time of usage) with a certain date in time.
E77 Persistent Item linked with a E61 Time Primitive:E77 (Persistent Item) - P92i (was brought into existence by) - E63 (Beginning of Existence) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Holy Lance (E22)] was brought into existence by [forging of Holy Lance (E12)] has time span [Moment/Duration of Forging of Holy Lance (E52)] ongoing throughout [0770-12-24 (E61)]
OA2 is used to link the end of a persistent item's (E77) life span (or time of usage) with a certain date in time.
E77 Persistent Item linked with a E61 Time Primitive:E77 (Persistent Item) - P93i (was taken out of existence by) - E64 (End of Existence) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [The one ring (E22)] was destroyed by [Destruction of the one ring (E6)] has time span [Moment of throwing it down the lava (E52)] ongoing throughout [3019-03-25 (E61)]
OA3 is used to link the birth of a person with a certain date in time.
E21 Person's Birth linked with an E61 Time Primitive: E21 (Person) - P98i (was born by) - E67 (Birth) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Stefan (E21)] was born by [birth of Stefan (E67)] has time span [Moment/Duration of Stefan's birth (E52)] ongoing throughout [1981-11-23 (E61)]
OA4 is used to link the death of a person with a certain date in time.
E21 Person's Death linked with a E61 Time Primitive:E21 (Person) - P100i (died in) - E69 (Death) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Lady Diana (E21)] died in [death of Diana (E69)] has time span [Moment/Duration of Diana's death (E52)] ongoing throughout [1997-08-31 (E61)]
OA5 is used to link the beginning of a temporal entity (E2) with a certain date in time. It can also be used to determine the beginning of a property's duration.
E2 Temporal Entity linked with a E61 Time Primitive:E2 (Temporal Entity) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Thirty Years' War (E7)] has time span [Moment/Duration of Beginning of Thirty Years' War (E52)] ongoing throughout [1618-05-23 (E61)]
OA6 is used to link the end of a temporal entity's (E2) with a certain date in time. It can also be used to determine the end of a property's duration.
E2 temporal entity linked with a E61 Time Primitive:E2 (temporal entity) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Thirty Years' War (E7)] has time span [Moment/Duration of End of Thirty Years' War (E52)] ongoing throughout [1648-10-24 (E61)]
OA7 is used to link two actors (E39) via a certain relationship.
E39 Actor linked with E39 ActorE39 (Actor) - P11i (participated in) - E5 (Event) - P11 (had participant) - E39 (Actor)
Example: [ Stefan (E21)] participated in [ Relationship from Stefan to Joachim (E5)] had participant [Joachim (E21)]
The connecting event is defined by an entity of class E55 (Type):
[Relationship from Stefan to Joachim (E5)] has type [Son to Father (E55)]
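Since OA7 relationships can be directional (see the relationship types above), the label seen from the other actor's side depends on the type. A hedged sketch; the dict below is an illustrative example, not a fixed OpenAtlas vocabulary:

```python
# Illustrative sketch of directional relationship types as used with OA7;
# the dict is an example vocabulary, not OpenAtlas internals.
inverse = {
    'Parent of': 'Child of',
    'Mentor of': 'Student of',
    'Leader of': 'Retinue of',
    'Friend of': 'Friend of',  # symmetric: same label in both directions
    'Enemy of': 'Enemy of'}

def label_from_range(relation_type: str) -> str:
    """Label of the relationship as seen from the range actor's side."""
    return inverse.get(relation_type, relation_type)

print(label_from_range('Parent of'))  # Child of
print(label_from_range('Friend of'))  # Friend of
```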
OA8 is used to link the beginning of a persistent item's (E77) life span (or time of usage) with a certain place, e.g. to document the birthplace of a person.
E77 Persistent Item linked with a E53 Place:E77 (Persistent Item) - P92i (was brought into existence by) - E63 (Beginning of Existence) - P7 (took place at) - E53 (Place)
Example: [Albert Einstein (E21)] was brought into existence by [Birth of Albert Einstein (E67)] took place at [Ulm (E53)]
OA9 is used to link the end of a persistent item's (E77) life span (or time of usage) with a certain place, e.g. to document a person's place of death.
E77 Persistent Item linked with a E53 Place:E77 (Persistent Item) - P93i (was taken out of existence by) - E64 (End of Existence) - P7 (took place at) - E53 (Place)
Example: [Albert Einstein (E21)] was taken out of existence by [Death of Albert Einstein (E69)] took place at [Princeton (E53)]
E77 (Persistent Item) - P8 (witnessed) - E5 (its own Creation or Modification) - P117 (occurs during) - E4 (period)
Example: [Church of Notre Dame (E18)] belongs stylistically to [Gothic Period (E4)]
E77 (Persistent Item) - P93i (was taken out of existence by) - E64 (end of Existence) - P4 (has time span) - E52 (Time Span)
Example: [The one ring (E22)] was destroyed by [Destruction of the one ring (E6)] has time span [Late Third Age (E52)]
E77 (Persistent Item) - P92i (was brought into existence by) - E63 (Beginning of Existence) - P4 (has time span) - E52 (Time Span)
Example: [Holy Lance (E22)] was brought into existence by [forging of Holy Lance (E12)] has time span [Carolingian Period (E52)]
In our model it is possible to link from E1 with P2 to E59.
This draft is about best practices for development not already covered by PEP 8, Pylint and Mypy.
Code review (Wikipedia) is an essential tool when developing quality software. We welcome interested persons and provide some general information about it here.
The code is available at GitHub and you'll find links to demo versions and manual, used technologies and much more on the OpenAtlas website.
Information about development is available here, in our Redmine Wiki.
We recommend using your favorite editor to analyze the code and install OpenAtlas locally to e.g. run tests.
A good read about working on code together is The Ten Commandments of Egoless Programming, as originally established in Jerry Weinberg's book.
These documents were written in 2015 and may be quite outdated. They are kept for historical reasons.
Mappings - How OpenAtlas maps its information within the CIDOC CRM (Examples)
Graphical User Interface - Concept, Fields, Forms and Content of the GUI
Localisation - Spatial position of physical things
Except where otherwise noted, content on this site is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License
CIDOC - http://www.cidoc-crm.org
CIDOC Properties (German) - http://cidoc-crm.gnm.de/wiki/Eigenschaften
DPP Map Viewer (experimental): http://dev.geo.univie.ac.at/projects/dpp/#activetypes=&center=41.226183305514596%2C22.851562500000004&selection=&selectioncategory=&time=300%2C1500&zoom=8
SELECT move.id, from_link.range_id, to_link.range_id FROM model.entity move
LEFT JOIN model.link from_link ON move.id = from_link.domain_id AND from_link.property_code = 'P27'
LEFT JOIN model.link to_link ON move.id = to_link.domain_id AND to_link.property_code = 'P26'
WHERE move.openatlas_class_name = 'move' AND from_link.range_id IS NULL AND to_link.range_id IS NOT NULL
ORDER BY move.id;
UPDATE model.entity SET class_code = 'E9', system_class = 'move' WHERE id IN ( SELECT id FROM model.entity WHERE id in (7511, 8215, 8420, 8422, 8770) AND class_code = 'E7');
UPDATE model.entity SET class_code = 'E9', system_class = 'move' WHERE id IN ( SELECT id FROM model.entity WHERE id in (1124, 1418, 950, 1409) AND class_code = 'E7');
Transform all events with type letter exchange to move events:
UPDATE model.entity SET class_code = 'E9' WHERE id IN (
SELECT e.id FROM model.entity e
JOIN model.link l ON e.id = l.domain_id AND l.range_id = 639 AND e.class_code = 'E7');
Update all move locations to start locations:
UPDATE model.link SET property_code = 'P27' WHERE id IN (
SELECT l.id FROM model.link l
JOIN model.entity e ON l.domain_id = e.id AND l.property_code = 'P7' AND e.class_code = 'E9');
Remove actors from move events and add them to the source:
BEGIN;
UPDATE model.entity SET class_code = 'E9' WHERE id IN (
SELECT e.id FROM model.entity e
JOIN model.link l ON e.id = l.domain_id AND l.range_id = 939 AND e.class_code = 'E7');
UPDATE model.link SET property_code = 'P27' WHERE id IN (
SELECT l.id FROM model.link l
JOIN model.entity e ON l.domain_id = e.id AND l.property_code = 'P7' AND e.class_code = 'E9');
INSERT INTO model.link (domain_id, range_id, property_code)
SELECT el.domain_id, l.range_id, 'P67' FROM model.link l
JOIN model.entity e ON l.domain_id = e.id AND l.type_id IN (862, 1091, 943, 1046, 1045)
JOIN model.link el ON e.id = el.range_id AND el.property_code = 'P67'
JOIN model.entity s ON el.domain_id = s.id AND s.class_code = 'E33';
DELETE FROM model.link WHERE id in (
SELECT l.id FROM model.link l
JOIN model.entity e ON l.domain_id = e.id AND l.type_id IN (862, 1091, 943, 1046, 1045)
JOIN model.link el ON e.id = el.range_id AND el.property_code = 'P67'
JOIN model.entity s ON el.domain_id = s.id AND s.class_code = 'E33'
);
COMMIT;
Remove descriptions from move events and add them to the source description:
BEGIN;
UPDATE model.entity s SET description = description || E'\r\n----\r\n' || (
SELECT e.description
FROM model.entity e
JOIN model.link l ON e.id = l.range_id AND l.property_code = 'P67' and e.class_code = 'E9' AND l.domain_id = s.id AND e.id NOT IN (1672, 1617, 1612, 1673, 1674, 1433, 1421, 1663, 1619, 1435, 1444, 1443, 1596, 1603, 826, 1496, 1493))
WHERE id IN (
SELECT l.domain_id
FROM model.entity e
JOIN model.link l ON e.id = l.range_id AND l.property_code = 'P67' and e.class_code = 'E9' AND e.id NOT IN (1672, 1617, 1612, 1673, 1674, 1433, 1421, 1663, 1619, 1435, 1444, 1443, 1596, 1603, 826, 1496, 1493));
UPDATE model.entity SET description = ''
WHERE id IN (
SELECT l.range_id
FROM model.entity e
JOIN model.link l ON e.id = l.range_id AND l.property_code = 'P67' and e.class_code = 'E9' AND e.id NOT IN (1672, 1617, 1612, 1673, 1674, 1433, 1421, 1663, 1619, 1435, 1444, 1443, 1596, 1603, 826, 1496, 1493));
COMMIT;
Join references
UPDATE model.link SET domain_id = 3204
WHERE property_code = 'P67' AND domain_id IN
(3216, 3218, 3222, 3230, 3234, 3238, 3285, 3287, 3705, 3713, 3715, 3719, 3745, 3815, 3845, 3849, 3853, 4272, 4276, 4281, 4286, 4293, 4297, 4301, 4367, 4369, 4374, 4494, 4731, 4735, 4742, 4926, 5130, 5132, 5137, 5141, 5148, 5149, 5154, 5158, 5232, 5236, 5240, 5245, 5264, 5269, 5273, 5277, 5281, 5286, 5290, 5294, 5295, 5299, 5303, 5307, 5311, 5315, 5319);
DELETE FROM model.entity WHERE id IN
(3216, 3218, 3222, 3230, 3234, 3238, 3285, 3287, 3705, 3713, 3715, 3719, 3745, 3815, 3845, 3849, 3853, 4272, 4276, 4281, 4286, 4293, 4297, 4301, 4367, 4369, 4374, 4494, 4731, 4735, 4742, 4926, 5130, 5132, 5137, 5141, 5148, 5149, 5154, 5158, 5232, 5236, 5240, 5245, 5264, 5269, 5273, 5277, 5281, 5286, 5290, 5294, 5295, 5299, 5303, 5307, 5311, 5315, 5319);
To allow Cross-Origin Resource Sharing (CORS), the OpenAtlas API uses flask-cors. Every path under /api/ is protected, but access is currently allowed from any origin ('*').
cors = CORS(app, resources={r"/api/*": {"origins": app.config['CORS_ALLOWANCE']}})
Origins point to a global variable, which can be changed in instance/production.py. Default value:
CORS_ALLOWANCE = '*'
The value can be a case-sensitive string for a single origin, a regular expression, a list, or an asterisk (*) as a wildcard.
Examples:
CORS_ALLOWANCE = 'https://thanados.net/'
CORS_ALLOWANCE = ['https://thanados.net/', 'https://openatlas.eu/']
CORS_ALLOWANCE = r'^((https?:\/\/)?.*?([\w\d-]*\.[\w\d]+))($|\/.*$)'
https://flask-cors.readthedocs.io/
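To see what the regular expression variant of `CORS_ALLOWANCE` accepts, it can be tested directly with Python's `re` module. This is a sketch of the matching behavior only; flask-cors performs its own comparable origin check internally.

```python
import re

# The regex from the CORS_ALLOWANCE example above.
CORS_ALLOWANCE = r'^((https?:\/\/)?.*?([\w\d-]*\.[\w\d]+))($|\/.*$)'

def origin_allowed(origin: str) -> bool:
    """Check a request origin against the allowance pattern."""
    return re.match(CORS_ALLOWANCE, origin) is not None

print(origin_allowed('https://thanados.net/'))  # True
print(origin_allowed('localhost'))              # False (no dotted hostname)
```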
The OpenAtlas data model can store chronological information for certain entities and properties.
In general, a timespan can be documented for both the beginning and the end of those entities and properties. The smallest possible timespan unit is one day. There are no restrictions regarding the length of these time spans.
E.g. if you do not know the exact date of an actor's birth, but only that she/he was born at the earliest in 800 AD and at the latest before 805 AD, this timespan would range from Jan. 1st, 800 to Jan. 1st, 805. Given that the person died on the 23rd of Aug. 877, you would only enter that day as an end date. In the user interface this case would look like:
The system automatically creates the correct timespan given by the bounding dates provided by the user:
This way you can store chronological information as precise as possible or as fuzzy as necessary. Also it is possible to add a comment on the date e.g. "circa".
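The bounding-date principle above can be sketched with the birth/death example. A hedged illustration: the field names follow begin_from/begin_to/end_from/end_to from the data model, but the helper is hypothetical.

```python
from datetime import date

# Sketch of the bounding-date principle described above, using the
# example: born at the earliest in 800 AD, at the latest before 805 AD,
# died on an exactly known day. Field names follow the data model;
# the helper function itself is illustrative only.
birth_span = {'begin_from': date(800, 1, 1), 'begin_to': date(805, 1, 1)}
death_span = {'end_from': date(877, 8, 23), 'end_to': None}  # exact date

def is_exact(span_from, span_to) -> bool:
    """A date is exact when no distinct second bounding value is given."""
    return span_to is None or span_from == span_to

print(is_exact(birth_span['begin_from'], birth_span['begin_to']))  # False
print(is_exact(death_span['end_from'], death_span['end_to']))      # True
```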
The same principle works for all of the following entities and properties:
Entities
Physical Things
Places (E18)
Features (E18)
Stratigraphic Units (E18)
Finds (E22)
Actors
Persons (E21)
Groups (E74)
Legal Bodies (E40)
Events
Activity (E7)
Acquisition (E8)
Production (E12)
Destruction (E6)
Properties that link actors to Events
performed (P14)
participated in (P11)
acquired title through (P22)
surrendered title through (P23)
Properties that link actors to Actors
has relationship to (OA7)
is current or former member of (P107)
The data is stored in the tables model.entity and model.link respectively, in the fields begin_from, begin_to, begin_comment, end_from, end_to and end_comment as timestamps.
Depending on the class of the entity, that is to say the domain and range classes of the link, these dates can be mapped as time primitive (E61) entities within the CIDOC CRM.
The respective paths are the following:
E77 (Persistent Item) - P92i (was brought into existence by) - E63 (Beginning of Existence) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Holy Lance (E22)] was brought into existence by [forging of Holy Lance (E12)] has time span [Moment/Duration of Forging of Holy Lance (E52)] ongoing throughout [0770-12-24 (E61)]
E77 Persistent Item end linked with a E61 Time Primitive:E77 (Persistent Item) - P93i (was taken out of existence by) - E64 (End of Existence) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [The one ring (E22)] was destroyed by [Destruction of the one ring (E6)] has time span [Moment of throwing it down the lava (E52)] ongoing throughout [3019-03-25 (E61)]
E21 Person's Birth linked with an E61 Time Primitive: E21 (Person) - P98i (was born by) - E67 (Birth) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Stefan (E21)] was born by [birth of Stefan (E67)] has time span [Moment/Duration of Stefan's birth (E52)] ongoing throughout [1981-11-23 (E61)]
E21 Person's Death linked with a E61 Time Primitive:E21 (Person) - P100i (died in) - E69 (Death) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Lady Diana (E21)] died in [death of Diana (E69)] has time span [Moment/Duration of Diana's death (E52)] ongoing throughout [1997-08-31 (E61)]
E2 Temporal Entity (also property) begin linked with a E61 Time Primitive:E2 (Temporal Entity) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Thirty Years' War (E7)] has time span [Moment/Duration of Beginning of Thirty Years' War (E52)] ongoing throughout [1618-05-23 (E61)]
E2 temporal entity (also property) end linked with a E61 Time Primitive:E2 (temporal entity) - P4 (has time span) - E52 (Time Span) - P81 (ongoing throughout) - E61 (Time Primitive)
Example: [Thirty Years' War (E7)] has time span [Moment/Duration of End of Thirty Years' War (E52)] ongoing throughout [1648-10-24 (E61)]
These are advanced installation notes for deploying OpenAtlas on a Debian server.
We use these instructions for our own workflow. They are very specific and detailed (e.g. changing the prompt to use colors and show git information), so feel free to use/adapt as needed.
apt install aptitude ntpsec vim
The OpenAtlas user interface is currently available in English and German. You may want to install the needed locales with this command:
dpkg-reconfigure locales
PermitRootLogin no
PasswordAuthentication no
# apt-get install unattended-upgrades apt-listchanges
Unattended-Upgrade::Mail "root";
rkhunter is a Unix-based tool that scans for rootkits, backdoors and possible local exploits. To install rkhunter and prevent false positives when deploying OpenAtlas, follow the instructions below.
Installation
apt install rkhunter
Configuration
vim /etc/rkhunter.conf
vim /etc/rkhunter.conf.local
rkhunter -c --sk
rkhunter --propupd
vim /etc/default/rkhunter
apt install apache2 (needed for permissions and file structure)
groupadd web-admin
usermod -a -G web-admin alex
chgrp -R web-admin /var/www
chmod -R 775 /var/www
chmod g+s /var/www
vim ~/.profile
umask 002
mkdir /var/www/openatlas
mkdir /var/www/frontend
vim ~/.bashrc
function parse_git_dirty {
    [[ $(git status 2> /dev/null | tail -n1) != "nothing to commit, working tree clean" ]] && echo "*"
}
function parse_git_branch {
    git branch --no-color 2> /dev/null | sed -e '/^[^*]/d' -e "s/* \(.*\)/[\1$(parse_git_dirty)]/"
}
PS1='\[\e[1;34m\]\u@\h:\w\[\e[0;32m\]$(parse_git_branch)\[\e[1;34m\]\$ \[\e[m\]'
Change default editor to vim:
update-alternatives --config editor
Next, follow the instructions on how to install OpenAtlas: https://github.com/craws/OpenAtlas/blob/main/install.md
For git e.g. ACDH-CH:
$ git config --global http.proxy http://fifi.arz.oeaw.ac.at:8080
$ pip3 install --proxy=http://fifi.arz.oeaw.ac.at:8080 calmjs
$ npm config set proxy http://fifi.arz.oeaw.ac.at:8080
# a2dismod autoindex
# service apache2 restart
sudo apt install brotli
sudo a2enmod brotli
<IfModule mod_brotli.c>
AddOutputFilterByType BROTLI_COMPRESS text/html text/plain text/xml text/css text/javascript application/javascript application/json application/xml
BrotliCompressionQuality 4
</IfModule>
sudo service apache2 restart
(For sites on ACDH-CH servers ignore this, the certificate has to be managed by the proxy server.)
# apt install certbot python3-certbot-apache
# certbot --apache
# certbot
After configuration of certbot, uncomment the line with WSGIDaemonProcess in /etc/apache2/sites-available/XXX.conf before creating certificates for OpenAtlas instances.
System mails (e.g. from cron jobs) are implemented with msmtp.
# apt install msmtp msmtp-mta ca-certificates
# vim /etc/msmtprc
# vim /etc/aliases
# msmtp root (to test; write a few lines so the mail doesn't get flagged as spam, then press CTRL + D)
SQL statements used for producing demo data
Todo: Adapt like demo-dev SQL
BEGIN;

-- Disable triggers, otherwise script takes forever and/or run into errors
ALTER TABLE model.entity DISABLE TRIGGER on_delete_entity;
ALTER TABLE model.link_property DISABLE TRIGGER on_delete_link_property;

-- Delete data from other users than Sonja and Petra
DELETE FROM model.entity WHERE id IN (
    SELECT entity_id FROM web.user_log
    WHERE action = 'insert'
        AND class_code IN ('E33', 'E6', 'E7', 'E8', 'E12', 'E21', 'E40', 'E74', 'E18', 'E31', 'E84')
        AND user_id NOT IN (21, 16));

-- Delete unrelated user
DELETE FROM web.user WHERE username NOT IN ('Alex', 'Demolina', 'jpreiser', 'pheinicker', 'sduenneb');

-- Insert demo user
INSERT INTO web.user (username, real_name, email, active, group_id, password) VALUES (
    'Demolina',
    'Demolina',
    'demolina@example.com',
    True,
    (SELECT id FROM web.group WHERE name = 'editor'),
    '$2b$12$9T05T1IiCnlEiUdf5gSosuSYewK5Rf4T/PwuvbSXEooR95BG2kgvG');

-- Disable email, set sitename and other settings
UPDATE web.settings SET value = '' WHERE name = 'mail';
UPDATE web.settings SET value = '' WHERE name LIKE 'mail_%';
UPDATE web.settings SET value = 'openatlas@craws.net' WHERE name LIKE 'mail_recipients_feedback';
UPDATE web.settings SET value = '1' WHERE name = 'file_upload_max_size';
UPDATE web.settings SET value = 'Demo' WHERE name = 'site_name';

-- Update content
UPDATE web.i18n SET text = '<p>Demo site for <a href="http://openatlas.eu/">OpenAtlas</a> projects. <a href="/login">Login</a>.</p> <p>The data will be reset daily around midnight. Demo data kindly provided by:</p> <p><strong>Mapping Medieval Conflicts (MEDCON). A digital approach towards political dynamics in the pre-modern period</strong><br /><br />MEDCON was funded within the go!digital-programme of the Austrian Academy of Sciences (OEAW) from October 2014 to May 2017 and hosted at the Institute for Medieval Research of OEAW The project headed by Johannes Preiser-Kapeller examined the explanatory power of concepts of social and spatial network analysis for phenomena of political conflict in medieval societies.<br /><br />The data presented in this demo version stems from two of MEDCON´s case studies, “Emperor Frederick III and the League of the Mailberger coalition in 1451/52” (executed by Kornelia Holzner-Tobisch and Petra Heinicker) and “Factions and alliances in the fight of Maximilian I for Burgundy” (Sonja Dünnebeil).<br /><br />For further information on the project see: <a href="http://oeaw.academia.edu/MappingMedievalConflict" target="_blank" rel="noopener noreferrer">Mapping Medieval Conflict</a> or contact Johannes.Preiser-Kapeller@oeaw.ac.at.</p> <p><strong>OpenAtlas</strong></p>' WHERE name = 'intro' AND language = 'en';
UPDATE web.i18n SET text = '<p style="text-align: left;">Demo Seite für <a href="http://openatlas.eu/">OpenAtlas</a> Projekte. Zum <a href="/login">Login</a>.</p> <p>Die Daten werden täglich gegen Mitternacht zurückgesetzt. Demo Daten freundlicherweise zur Verfügung gestellt von:</p> <p><strong>Mapping Medieval Conflicts (MEDCON). A digital approach towards political dynamics in the pre-modern period</strong><br /><br />MEDCON wurde durch das go!digital-Programm der Österreichischen Akademie der Wissenschaften (ÖAW) finanziert und vom Oktober 2014 bis zum Mai 2017 am Institut für Mittelalterforschung der ÖAW durchgeführt. Das Projekt untersuchte unter der Leitung von Johannes Preiser-Kapeller die Erklärungskraft von Konzepten der sozialen und geographischen Netzwerkanalyse für Phänomene des politischen Konflikts in mittelalterlichen Gesellschaften.<br /><br />Die Daten, die in dieser Demo-Version präsentiert werden, stammen aus zwei Fallstudien von MEDCON, „Kaiser Friedrich III. und die Liga der Mailberger Koalition, 1451/52“ (durchgeführt durch Kornelia Holzner-Tobisch und Petra Heinicker) und „Fraktionen und Allianzen im Kampf von Maximilian I. um Burgund“ (Sonja Dünnebeil).<br /><br />Weitere Informationen zum Projekt finden Sie hier: <a href="http://oeaw.academia.edu/MappingMedievalConflict" target="_blank" rel="noopener noreferrer">Mapping Medieval Conflict</a> (bzw. Kontakt: Johannes.Preiser-Kapeller@oeaw.ac.at).</p> <p><strong>OpenAtlas</strong></p>' WHERE name = 'intro' AND language = 'de';
UPDATE web.i18n SET text = 'Webmaster: alexander.watzinger@craws.net' WHERE name = 'contact' AND language = 'en';
UPDATE web.i18n SET text = 'Webmaster: alexander.watzinger@craws.net' WHERE name = 'contact' AND language = 'de';
UPDATE web.i18n SET text = '' WHERE name = 'legal_notice' AND language = 'en';
UPDATE web.i18n SET text = '' WHERE name = 'legal_notice' AND language = 'de';

-- Delete orphans manually because triggers are disabled
DELETE FROM model.entity WHERE id IN (
    SELECT e.id FROM model.entity e
    LEFT JOIN model.link l1 on e.id = l1.domain_id
    LEFT JOIN model.link l2 on e.id = l2.range_id
    LEFT JOIN model.link_property lp2 on e.id = lp2.range_id
    WHERE l1.domain_id IS NULL AND l2.range_id IS NULL AND lp2.range_id IS NULL
        AND e.class_code IN ('E61', 'E41', 'E53', 'E82'));

-- Delete orphaned translations
DELETE FROM model.entity WHERE system_type = 'source translation'
    AND id NOT IN (SELECT range_id FROM model.link WHERE property_code = 'P73');

-- Re-enable triggers
ALTER TABLE model.entity ENABLE TRIGGER on_delete_entity;
ALTER TABLE model.link_property ENABLE TRIGGER
on_delete_link_property; COMMIT;
After executing, test for orphaned locations and delete them with the SQL statements above.
-- SQL to filter demo data from DPP BEGIN; -- Disable triggers, otherwise script takes forever and/or runs into errors ALTER TABLE model.entity DISABLE TRIGGER on_delete_entity; -- Delete data from other case studies DELETE FROM model.entity WHERE id NOT IN (SELECT e.id FROM model.entity e JOIN model.link l ON e.id = l.domain_id AND l.property_code = 'P2' AND l.range_id IN (SELECT id FROM model.entity WHERE name IN ('Ethnonym of the Vlachs'))) AND class_code IN ('E33', 'E6', 'E7', 'E8', 'E12', 'E21', 'E74', 'E40', 'E31', 'E18', 'E84', 'E22') AND (system_type IS NULL OR system_type NOT IN ('source translation')); -- Delete orphans manually because triggers are disabled DELETE FROM model.entity WHERE id IN ( SELECT e.id FROM model.entity e LEFT JOIN model.link l1 on e.id = l1.domain_id LEFT JOIN model.link l2 on e.id = l2.range_id WHERE l1.domain_id IS NULL AND l2.range_id IS NULL AND e.class_code IN ('E61', 'E41', 'E53', 'E82')); -- Delete orphaned translations DELETE FROM model.entity WHERE system_type = 'source translation' AND id NOT IN (SELECT range_id FROM model.link WHERE property_code = 'P73'); -- Delete unrelated user DELETE FROM web.user WHERE username NOT IN ('Alex', 'dschmid', 'bkoschicek', 'mpopovic', 'jnikic'); -- Insert demo user INSERT INTO web.user (username, real_name, email, active, group_id, password) VALUES ( 'Demolina', 'Demolina', 'demolina@example.com', True, (SELECT id FROM web.group WHERE name = 'editor'), '$2b$12$9T05T1IiCnlEiUdf5gSosuSYewK5Rf4T/PwuvbSXEooR95BG2kgvG'); -- Disable email, set sitename and other settings UPDATE web.settings SET value = '' WHERE name = 'mail'; UPDATE web.settings SET value = '' WHERE name LIKE 'mail_%'; UPDATE web.settings SET value = 'openatlas@craws.net' WHERE name LIKE 'mail_recipients_feedback'; UPDATE web.settings SET value = '1' WHERE name = 'file_upload_max_size'; UPDATE web.settings SET value = 'Development Demo' WHERE name = 'site_name'; UPDATE web.settings SET value = 'Development Demo' WHERE name = 
'site_header'; -- Update content UPDATE web.i18n SET text = '<p>Development Demo site for <a href="http://openatlas.eu/">OpenAtlas</a> projects. <a href="/login">Login</a>.</p> <p>The data will be reset daily around midnight. Demo data kindly provided by:</p> <p><strong>The Ethnonym of the Vlachs in the Written Sources and the Toponymy in the Historical Region of Macedonia</strong> (11th-16th Cent.) <a href="http://dpp.oeaw.ac.at/index.php?seite=CaseStudies&submenu=skopje" target="_blank" rel="noopener noreferrer">More Information</a></p> <p>The present demo version is the result of a scholarly project, which was submitted by the digital cluster project “Digitising Patterns of Power (<a href="http://dpp.oeaw.ac.at/" target="_blank" rel="noopener noreferrer">DPP</a>)” at the Institute for Medieval Research (Austrian Academy of Sciences, Vienna) and the Ss. Cyril and Methodius University of Skopje (Faculty of Philosophy, Institute for History). It focuses on the interplay between the resident population and the nomads (i.e. the Vlachs) in the historical region of Macedonia from the 11th to the 16th centuries.<br /><br />This region at the crossroads of Orthodoxy, Roman Catholicism and Islam and the question of the origin of the Vlachs, who identify themselves as a separate ethnic group until modern times, as well as the ethnonym "Vlachs" and its derivatives in the form of toponyms and personal names are at the core of the joint research. Hereby, historical and archaeological research is combined with Digital Humanities.<br /><br />The project, which was successfully submitted by the project coordinators Doz. Dr. Mihailo Popović and Prof. Dr. Toni Filiposki, is funded by the Centre for International Cooperation & Mobility (ICM) of the Austrian Agency for International Cooperation in Education and Research (OeAD-GmbH) for two years (2016-18) and forms an additional case study within DPP. 
<br /><br />Project teams:<br /><br />Toni Filiposki (project leader / Skopje), Boban Petrovski (Skopje), Nikola Minov (Skopje), Vladimir Kuhar (Skopje), Boban Gjorgjievski (Skopje)<br /><br />Mihailo Popović (project leader / Vienna), Jelena Nikić (Vienna), David Schmid (Vienna)</p> <p><strong>OpenAtlas</strong></p>' WHERE name = 'intro' AND language = 'en'; UPDATE web.i18n SET text = '<p style="text-align: left;">Development Demo Seite für <a href="http://openatlas.eu/">OpenAtlas</a> Projekte. Zum <a href="/login">Login</a>.</p> <p>Die Daten werden täglich gegen Mitternacht zurückgesetzt. Demo Daten freundlicherweise zur Verfügung gestellt von:</p> <p><strong>The Ethnonym of the Vlachs in the Written Sources and the Toponymy in the Historical Region of Macedonia</strong> (11th-16th Cent.) <a href="http://dpp.oeaw.ac.at/index.php?seite=CaseStudies&submenu=skopje" target="_blank" rel="noopener noreferrer">More Information</a></p> <p>The present demo version is the result of a scholarly project, which was submitted by the digital cluster project “Digitising Patterns of Power (<a href="http://dpp.oeaw.ac.at/" target="_blank" rel="noopener noreferrer">DPP</a>)” at the Institute for Medieval Research (Austrian Academy of Sciences, Vienna) and the Ss. Cyril and Methodius University of Skopje (Faculty of Philosophy, Institute for History). It focuses on the interplay between the resident population and the nomads (i.e. the Vlachs) in the historical region of Macedonia from the 11th to the 16th centuries.<br /><br />This region at the crossroads of Orthodoxy, Roman Catholicism and Islam and the question of the origin of the Vlachs, who identify themselves as a separate ethnic group until modern times, as well as the ethnonym "Vlachs" and its derivatives in the form of toponyms and personal names are at the core of the joint research. 
Hereby, historical and archaeological research is combined with Digital Humanities.<br /><br />The project, which was successfully submitted by the project coordinators Doz. Dr. Mihailo Popović and Prof. Dr. Toni Filiposki, is funded by the Centre for International Cooperation & Mobility (ICM) of the Austrian Agency for International Cooperation in Education and Research (OeAD-GmbH) for two years (2016-18) and forms an additional case study within DPP. <br /><br />Project teams:<br /><br />Toni Filiposki (project leader / Skopje), Boban Petrovski (Skopje), Nikola Minov (Skopje), Vladimir Kuhar (Skopje), Boban Gjorgjievski (Skopje)<br /><br />Mihailo Popović (project leader / Vienna), Jelena Nikić (Vienna), David Schmid (Vienna)</p> <p> <strong>OpenAtlas</strong></p>' WHERE name = 'intro' AND language = 'de'; UPDATE web.i18n SET text = 'Webmaster: alexander.watzinger@craws.net' WHERE name = 'contact' AND language = 'en'; UPDATE web.i18n SET text = 'Webmaster: alexander.watzinger@craws.net' WHERE name = 'contact' AND language = 'de'; UPDATE web.i18n SET text = '' WHERE name = 'legal_notice' AND language = 'en'; UPDATE web.i18n SET text = '' WHERE name = 'legal_notice' AND language = 'de'; -- Re-enable triggers ALTER TABLE model.entity ENABLE TRIGGER on_delete_entity; COMMIT;
The API can basically be accessed in two ways: either from the user interface of an OpenAtlas application or, if the settings allow it, from another application.
Please also refer to the SwaggerHub documentation: https://app.swaggerhub.com/apis-docs/ctot-nondef/OpenAtlas/0.2
These endpoints can provide full information about one or more entities. The output format is the Linked Places Format (LPF). Alternatively, a simple GeoJSON format and multiple RDF serializations derived from the LPF are available.
Retrieves a representation of an entity by its ID.
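As a minimal sketch of accessing this endpoint from another application, the URL can be built following the {domain}/api/{api version}/{endpoint} schema described above (the demo domain and entity ID are just examples; the actual fetch is left commented out since it requires network access):

```python
import json
import urllib.request

def entity_url(base, version, entity_id):
    # Follows the {domain}/api/{api version}/{endpoint} schema
    return f"{base}/api/{version}/entity/{entity_id}"

url = entity_url("https://demo.openatlas.eu", "0.3", 1234)
# Uncomment to fetch the Linked Places representation (needs network access):
# entity = json.load(urllib.request.urlopen(url))
```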
Retrieves a JSON list of entities based on their CIDOC CRM class code. The output contains a results key and a pagination key. All class codes available in OpenAtlas can be found under OpenAtlas and CIDOC CRM class mapping. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a JSON list of entities based on their OpenAtlas view name. Available categories can be found at OpenAtlas and CIDOC CRM class mapping. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a JSON list of entities based on their OpenAtlas system class. Available categories can be found at OpenAtlas and CIDOC CRM class mapping. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
With the query endpoint, one can combine the three endpoints above in a single query. Each requested item has to be passed as a separate parameter. Possible parameters are:
For more details on the different queries, please consult the associated section. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
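As a sketch, a query URL combining several parameters could be assembled like this. The exact parameter names for the query endpoint (here `classes` and `codes`) are assumptions and should be checked against the Swagger documentation:

```python
from urllib.parse import urlencode

def query_url(base, version, **params):
    # doseq=True would expand list values into repeated parameters
    return f"{base}/api/{version}/query?" + urlencode(params, doseq=True)

url = query_url("https://demo.openatlas.eu", "0.3",
                classes="E21", codes="actor", limit=50, sort="desc")
```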
Retrieves the latest entries made in the OpenAtlas database. The number specifies how many entities are retrieved; /latest can be any number from 1 to 100 inclusive.
Retrieves a list of entities which are linked to the entity with the given ID. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a list of entities based on their OpenAtlas type. A possible ID can be obtained, for example, from the type_tree or node_overview endpoint. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a list of entities based on their OpenAtlas type. This also includes all entities which are connected to a subtype. A possible ID can be obtained, for example, from the type_tree or node_overview endpoint. The result can be filtered, ordered and manipulated through different parameters. By default, results are ordered alphabetically and 20 entities are shown.
Retrieves a JSON list of entity names, IDs and URLs based on their OpenAtlas type. Be aware that "Historical Place" and "Administrative Units" cannot be retrieved this way.
Retrieves a JSON list of entity names, IDs and URLs based on their OpenAtlas type. This also includes all subtypes of the given type ID. Be aware that "Historical Place" and "Administrative Units" cannot be retrieved this way.
Retrieves a JSON list of entity names, IDs and URLs for the first layer of subunits of the given entity ID.
Retrieves a JSON list of entity names, IDs and URLs of all subunits of the given entity ID.
Retrieves a detailed JSON list of all OpenAtlas types. This also includes a list of the children of each type.
Retrieves a JSON list of all OpenAtlas types sorted by custom, places, standard and value.
Retrieves a JSON list of all OpenAtlas types sorted by custom, places, standard and value. This is just a renamed version of node_overview/.
Takes only a valid place (E18) ID. Retrieves a list of the given place and all of its subunits. This endpoint provides a special Thanados format. With the format=xml parameter, XML output can be created.
Provides a list of all available system classes, their CIDOC CRM mapping, the view they belong to, the icon used and the English name.
Retrieves a JSON of the content (intro, legal notice, contact and the size for processed images) of the OpenAtlas instance. The language can be chosen with the lang parameter (en or de).
Retrieves a list of all selected geometries in the database in a standard GeoJSON format. This endpoint should be used for map overviews.
Retrieves a list of how many entities each system class has.
Provides the image of the requested ID. Be aware, the image will only be displayed if:
The development of OpenAtlas depends on many factors and is very fluid. Here you find explanations of keywords and concepts used in planning.
A feature is an isolated task. It can be a new functionality (e.g. an upload function for a logo), added functionality to an existing feature (e.g. adding new export formats) or a fix for a bug (e.g. repairing a button that isn't working as expected). Ideally, it should not take more than a week to implement one feature. If it is expected that a feature takes much longer, we try to break it up into smaller tasks. Features are either planned in meetings or requested by users.
The version naming scheme is sequence-based and reflects the significance: major.minor.fix e.g. 3.11.1
The first number is raised rarely and symbolizes major changes and/or breaking changes. E.g. after the whole application was rewritten in a new language (from PHP to Python) the first number was raised from 2 to 3.
The second number is the one that is raised most often. It includes a collection of features, typically around two to four, and should be doable in about a month. When a minor version is finished and deployed to the productive systems, it is also a good time to review how development is progressing and to adapt the planning if needed.
The third number is for fixing errors. Since we think that errors should be resolved as fast as possible, we often don't want to wait a month for the next minor release, especially on productive systems where people are working on their projects. These releases are quite infrequent, depending on reported errors and their severity.
There are many feature requests (ideas for new functionality). Usually we plan ahead about three minor versions (around ten features, doable in about a quarter of a year) and call it the Roadmap. The roadmap has to be flexible because of new projects or changed requirements of existing ones. E.g. new features are often beneficial for multiple projects and so the priorities can shift to reflect this.
Many factors come into play when deciding which features are put onto the roadmap. One major factor is the requirements of projects which are financing the development of OpenAtlas, but there are other factors as well.

The wishlist is a special version in the roadmap. It is a collection of ideas and suggestions which we would like to implement in the future. In a new version we not only include critical features but also try to implement features from the wishlist. The roadmap is publicly accessible so other projects or involved people can browse through it and identify features which are imperative for their needs.
If and when a feature is taken from the wishlist and included in a new version depends on several factors, like funding, the usefulness for the majority of projects, or whether prerequisites need to be fulfilled first. It can also happen that features already planned for the next versions are moved back to the wishlist, e.g. if a feature turns out to be much more time consuming than expected or if the project requesting it decides to change its own priorities.
We try to keep this list current and take your votes (query) into account when planning next features. Please keep in mind that there are other criteria as well e.g. there may be technical prerequisites.
Sadly, the still-open request from 2008 at Redmine, "Add voting to tickets", was never implemented, so we have to count votes manually.
Votes | Issue | Title | Prerequisite |
---|---|---|---|
1 | #1352 | Tool for Anthropological Analyses | #1660 |
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": {
        "type": "Point",
        "coordinates": [
          15.6432715260176,
          48.5867361212989
        ],
        "title": "",
        "description": ""
      },
      "properties": {
        "@id": 50505,
        "systemClass": "place",
        "name": "Thunau Obere Holzwiese",
        "description": "In the area of Obere Holzwiese 215 inhumation burials were documented in different excavations. The cemetery ranges from the (later) 8th c. to the (early) 10th c.\r\n##German\r\n215 Bestattungen wurden im Bereich der Oberen Holzwiese dokumentiert; im NW-Areal wird eine Holzkirche vermutet, die wohl bereits in der ersten Hälfte des 9. Jahrhunderts vorhanden war.",
        "begin_earliest": "0750-01-01",
        "begin_latest": "0750-12-31",
        "begin_comment": null,
        "end_earliest": "0950-01-01",
        "end_latest": "0950-12-31",
        "end_comment": null,
        "types": [
          "Inhumation Cemetery",
          "Excavation"
        ]
      }
    }
  ]
}
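A FeatureCollection like the one above can be consumed with any GeoJSON-aware tool. As a minimal sketch, extracting marker data for a map overview in Python (using a trimmed copy of the example data):

```python
import json

# A trimmed version of the FeatureCollection example above
geojson = """
{"type": "FeatureCollection",
 "features": [{"type": "Feature",
               "geometry": {"type": "Point",
                            "coordinates": [15.6432715260176, 48.5867361212989]},
               "properties": {"@id": 50505,
                              "systemClass": "place",
                              "name": "Thunau Obere Holzwiese"}}]}
"""

data = json.loads(geojson)
# Collect (name, longitude, latitude) tuples for map markers
markers = [(f["properties"]["name"], *f["geometry"]["coordinates"])
           for f in data["features"]]
```

Note that GeoJSON coordinates are ordered longitude first, latitude second.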
These options can be configured in your global git config file e.g. ~/.gitconfig on Linux:
[user]
    name = Your Name
    email = your@email.net
[color]
    ui = true
[core]
    editor = vim
[fetch]
    prune = true
[merge]
    tool = meld
[mergetool]
    keepBackup = false
[pull]
    rebase = false
[status]
    showUntrackedFiles = all
git config --global fetch.prune true
vim ~/.bashrc
function parse_git_dirty {
    [[ $(git status 2> /dev/null | tail -n1) != "nothing to commit (working directory clean)" ]] && echo "*"
}
function parse_git_branch {
    git branch --no-color 2> /dev/null | sed -e '/^[^*]/d' -e "s/* \(.*\)/[\1$(parse_git_dirty)]/"
}
PS1='\[\e[1;34m\]\u@\h:\w\[\e[0;32m\]$(parse_git_branch)\[\e[1;34m\]\$ \[\e[m\]'
We are using Git as a versioning system, following the Git Branching Workflows.
There is some discussion in the global community that this hinders continuous integration, but we like to have a develop branch to, for example, catch issues before they go into production.
Because we have a fast workflow and merge often, it works well and is "continuous" enough for our purposes.
If working on a new feature/fix:
git checkout develop
git checkout -b feature/something
git add .
git commit -m "Implemented feature something"
git merge develop
git checkout develop
git merge feature/something
About once a month a new release_candidate branch is made from develop which, after some quality checks, is then released and merged into the main branch.
When working in the release_candidate branch please also merge it to the develop branch afterwards.
If there are database changes, make sure that:
This document was written in 2015 and may be quite outdated. It is kept for historical reasons.
This section contains information on the web-based GUI of OpenAtlas for inserting, editing and managing data.
Requirements: Before accessing the UI, a user has to log in with username and password. If the credentials are correct, the user is redirected to a starting page.
Fields: Username, Password
Buttons: OK
Additional ideas: "forgot my password" button/hyperlink
A menu bar or something similar allowing the user to handle the UI. Global
tbd
The first page displayed after login, offering easy navigation to the most frequently used functions.
For now: Navigate between source and actor
Generally:
OpenAtlas uses Documents E31 and Linguistic Objects E33 to refer to other entities that are stored in the database.
To record that an information object or document, like an image, an article, a book (E31) or the text of a charter (E33), refers to another entity, P67 "refers to / is referred to by" is used.
A page to insert and edit sources like texts (e.g. medieval charters, journal articles, bibliographic references) or images (photographs, drawings, maps).
Prerequisite: Dublin Core: http://de.wikipedia.org/wiki/Dublin_Core
--> Record 15 core elements:
Fundamental Question: How to handle the metadata?
Map them within CIDOC CRM?: http://www.cidoc-crm.org/docs/dc_to_crm_mapping.pdf
Or add metadata in HTML/RDF/XML?: http://de.wikipedia.org/wiki/Dublin_Core#Anwendungen_von_Dublin_Core
Text Sources/integrate BibTex Format? http://bibtexml.sourceforge.net/
Suggestion: Store metadata in a separate table, using Dublin Core's core elements. Import/export from/to various formats (e.g. RIS/BibTeX).
User interface for inserting/editing bibliographical information. Bibliographical information is stored as an entity of class E31 (document) that has the type "Text" or a subtype. OpenAtlas will use the BibTeX format, stored as a string in the description field, to record bibliographical entries. http://en.wikipedia.org/wiki/BibTeX
Types:
Reference type | Description | Required fields | Optional fields |
---|---|---|---|
article | Newspaper or journal article | author, title, journal, year | volume, number, pages, month, note |
book | Book | author or editor, title, publisher, year | volume or number, series, address, edition, month, note, isbn |
booklet | Bound printed work | title | author, howpublished, address, month, year, note |
conference | Academic conference | author, title, booktitle, year | editor, volume or number, series, pages, address, month, organization, publisher, note |
inbook | Part of a book | author or editor, title, chapter and/or pages, publisher, year | volume or number, series, type, address, edition, month, note |
incollection | Part of a book with its own title (e.g. an essay in an edited volume) | author, title, booktitle, publisher, year | editor, volume or number, series, type, chapter, pages, address, edition, month, note |
inproceedings | Article in a conference proceedings | author, title, booktitle, year | editor, volume or number, series, pages, address, month, organization, publisher, note |
manual | Technical documentation | address, title, year | author, organization, edition, month, note |
mastersthesis | Diploma, master's or other final thesis (except doctoral) | author, title, school, year | type, address, month, note |
misc | Any entry (if nothing else fits) | - | author, title, howpublished, month, year, note |
phdthesis | Doctoral or other dissertation | author, title, school, year | type, address, month, note |
proceedings | Conference proceedings | title, year | editor, volume or number, series, address, month, organization, publisher, note |
techreport | Published report of a university or other institution | author, title, institution, year | type, note, number, address, month |
unpublished | Document not formally published | author, title, note | month, year |
The UI takes the user input from the text boxes, generates a string like the following and stores it in the "description" field.
@Book{hicks2001,
  author = "von Hicks, III, Michael",
  title = "Design of a Carbon Fiber Composite Grid Structure for the GLAST Spacecraft Using a Novel Manufacturing Technique",
  publisher = "Stanford Press",
  year = 2001,
  address = "Palo Alto",
  edition = "1st",
  isbn = "0-69-697269-4"
}
The UI displays a formatted version of this bibliographical entry in the "reference" tab and the Source Text in the "source" tab.
Idea for future developments: Import/Export .bib format
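As a sketch of the approach described above, a hypothetical helper that serializes form input into a BibTeX string for the description field. The function name and field handling are illustrative, not the actual OpenAtlas implementation:

```python
def to_bibtex(entry_type, key, fields):
    # Serialize form input into a BibTeX string for the description field
    lines = [f"@{entry_type}{{{key},"]
    lines += [f'  {name} = "{value}",' for name, value in fields.items()]
    return "\n".join(lines) + "\n}"

record = to_bibtex("Book", "hicks2001", {
    "author": "von Hicks, III, Michael",
    "publisher": "Stanford Press",
    "year": "2001",
    "address": "Palo Alto",
})
```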
Any entity can be linked via P67 "is referred to by" to a document.
Suggestion: at least Dublin Core + Technical Metadata like EXIF
Historical sources (e.g. medieval charters, chronicles etc.) are documented on three levels: 1. Content, 2. Bibliographical Reference and 3. Information Carrier.
The main node for recording historical sources is an E33 entity with the type "source content". All other entities are connected to this node.
Name: stored in identifier field.
Classification: P2 "has type" link to E55 type "Primary Source" or Subtype (e.g. charter...).
Content
Content of a written historical source is documented as E33 (Linguistic Object). Three parts are recorded:
Abstract/Content: E33 with a Type E55 that is "Source Content" = Summary of source's content and main node for further links. Stored in the entity's description field.
Translated Content: E33 with a Type E55 that is "Source Translation" = full text translated.
Link: "Source Content" via P73 "has translation" to the "Source Translation" (1-n). Name and type of this entity are automatically generated by the UI. The type is "Source Translation" and the name is the name of the source content plus the suffix "transl".
Original: E33 with a Type E55 that is "Source Original Text" = full text original.
Link: "Source Content" via P73 "has translation" to the "Source Original" (1-n). Name and type of this entity are automatically generated by the UI. The type is "Source Original Text" and the name is the name of the source content plus the suffix "orig".
Bibliographical References
The content (=E33 with type "Source Content") can be linked to various bibliographical references (E31 document) (1 - n). They can be for example various editions of charters in which the respective source is documented. Also secondary literature like articles, descriptions, discussions of this source can be linked to the E33 "Source Content" object via P67 "is referred to by".
Information Carrier
The content (=E33 "Source Content") can be linked (1-n, via P128 "is carried by") to the physical object that serves as information carrier (E84), for example a charter made of parchment that, as a physical object, can have a location, an owner etc.
Source Mapping: CIDOC-Mappings
Insert new source:
source content linked (p67i is referred to by) 1-n to editions (=document E31 with type charter)
source content linked (p128i is carried by) 1-n to information carriers (E84)
Later displayed in the info field of the source/document
Every content may have a physical object that carries the respective information. In the case of historical research these will most probably be charters: physical objects that have a certain place in space and time, which can of course change too.
For the first version we will use an "Information carrier light" form
Fields:
Information carrier name (entity name); description + "Überlieferung" (transmission) (entity description)
They represent an E84 entity (Information Carrier) that is linked (here 1 source content to n information carriers) to the source content entity of the document (E33 with type "source content") via P128 (is carried by).
this E84 entity is linked to an E55 type Entity (from http://redmine.craws.net/projects/uni/wiki/Basic_Types#Information-Carrier-Types) that can be selected
Time and place of production: If necessary an invisible production event will be created and linked to the E84 entity. This production event can again be linked to time primitives (via oa5 and oa6) and to a place/location of a physical thing-combination.
deprecated:
This information carrier (Class E84) can be linked to various other nodes:
Owner: link from E84 via P49 ("has former or current owner") to Actor or subclass (1-1): This is used to document e.g. the archive, library or collection that keeps the respective charter.
Signatory: link from E84 via P105 ("right held by") to Actor or subclass (1-n): This is used to document the original contracting party, person or legal body for whom the charter was created.
If a charter was written for the medieval Monastery of Ötting because Ötting was the receiver of a donation from the King, this charter is used by the legal body Ötting to document the right of possession of a certain property. This charter today is kept in the Carinthian Archive "Kärntner Landesarchiv". The mapping would be like the following:
The information carrier was of course produced at a certain time and place, and certain actors were involved in its production. The production of the object may also have been part of a superior event.
Thus the information carrier is automatically linked to its production event, and this event can be linked to various other entities (see the last figure of the linked section).
Together with the connecting property, in some cases a detailed definition can be stored in the link table, e.g. the role of a producing actor (writer, sponsor) or the reference a charter has in a certain archive.
Add new document Step 1:
Idea: Workflow for inserting documents
E33 linguistic object type "source content"
1. Type of document: select from "primary source" hierarchy (=link from E33 via P2 to E55) 1-1
necessary information
2. "Quellkürzel" (source abbreviation) = name, signature, abbreviation or whatever identifies the document (= name of E33) 1-1
necessary information
Only if the above two are inserted will the others activate.
3.1. Edition: Charter editions, 1-n links from E33 via p67i to E31 (type "edition")
search for existing editions. If found: activate Pages
if not found: add new one in new tab/popup? and link to source content afterwards (automatically?)
3.2. Pages or numbers are stored in the link's description
only active if Edition is selected.
3.3.
+ Add one more Edition link
- Remove: Ask if only link to E31 should be removed or the whole edition entry too
Edit: Edit edition entry
4.1 "Informationsträger" = original charter or physical thing that carries the information E33 linked via p128 to E84.
select existing ones, if found display respective information
If not found, add a new information carrier's name directly here; the UI creates an E84 entry with this name and a link to the E33
4.2. Pages or numbers are stored in the link's description
only active if an information carrier is selected.
4.3. "Ausstellungsort" - "Place of Production": E18/E53 Combination linked via P7 to a production event E12 that p108 produced the Information carrier
if not yet defined: Select from existing places/physical things.
if not found: Add a new one by entering a name. If the location is desired, open the detailed (site) view by clicking on the globe icon to select coordinates, use geosearch or whatever else you want to do on the map...
4.4. Time of Production OA5/OA6 links from E12 Production to Time Primitives from/to values.
if not yet defined: Use timespan menu for chronological Information
4.5. Buttons
+ Add one more Information Carrier
- Remove: Ask if only link to E84 should be removed or the whole information carrier network
Edit: Edit information carrier
5.1. Bibliographic Reference: secondary sources, 1-n links from E33 via p67i to E31 (type "secondary source")
search for existing references. If found: activate Pages
if not found: add new one in new tab/popup? and link to source content afterwards (automatically?)
5.2. Pages or numbers are stored in the link's description
only active if Reference is selected.
5.3.
+ Add one more Reference link
- Remove: Ask if only link to E31 should be removed or the whole reference entry too
Edit: Edit reference entry
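The activation rule of this workflow (the later sections only unlock once document type and name are present) could be sketched like this; the form keys and helper function are assumptions for illustration:

```python
# Hypothetical sketch of the form activation rule described above:
# editions, information carriers and references stay inactive until both
# the document type (step 1) and the name (step 2) are entered.
def sections_enabled(form):
    required_filled = bool(form.get("document_type")) and bool(form.get("name"))
    return {
        "editions": required_filled,
        "information_carriers": required_filled,
        "bibliographic_references": required_filled,
    }
```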
OpenAtlas currently uses:
E5 (Event) in general to record events that are defined in more detail via p2 (has type) to E55 (Type), e.g. E5 (Battle between Henry I and the Hungarians) p2 has type E55 Battle (1-1)
E8 (Acquisition) to record events that change the ownership of something from one actor to another
E12 (Production) to record the creation/founding of something. E.g. the foundation of the Monastery of St. Peter in Salzburg.
E6 (Destruction) to record the destruction/end of something. E.g. the destruction of the church of St. Peter near Moosburg in a fire.
Next to the general classification, "types" defined freely by the users can be used to categorize the event's character: Event p2 has type (1-n) E55 (see Basic Types)
One event can be part of another event, e.g. the Battle of Hastings (E5) was part of (p117 occurs during) the Norman conquest of England (E5); the donation of the church of St. Peter near Osterwitz (E8) occurs during the Synod of Maria Saal 927 (E5): E5 or subclass linked to E5 or subclass via p117.
One event can take place (P7) at a certain physical thing/place (=E18+E53) like e.g. the court of Charlemagne in Aachen.
Further relations:
Events and actors:
In this tab/section actors that are involved in this event should be listed. Their connection to the event is defined
1. by a property that marks the general involvement of the actor
p11 had participant/participated in
p14 carried out by/performed
p22 transferred title to/acquired title through
p23 transferred title from/surrendered title through
2. by a type that marks the role of the actor (Basic Types)
E55 type of the link/property
E.g. Michelangelo (E21) performed (p14) the painting of the Sistine Chapel (E12) in the role as Artist/Creator (E55) while Pope Julius II (E21) performed (p14) this production of the painting (E12) as sponsor (E55)
Events and Physical Things
Here (for now) only physical things are listed that are directly connected to the event, e.g. if they are sold/donated/traded (p24 transferred title of/changed ownership through) within an acquisition event (E8).
Events and Sub Events
Here Sub-Events are listed that occur during the main event. They are linked via p117 to the superior event.
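The Michelangelo example could be represented as qualified links, roughly like this (labels and helper are illustrative only):

```python
# Illustrative encoding of "actor - property - event" links with the role
# stored as the E55 type of the link, as described above.
EVENT_LINKS = [
    {"actor": "Michelangelo (E21)", "property": "P14",
     "event": "Painting of the Sistine Chapel (E12)", "role": "Artist/Creator"},
    {"actor": "Pope Julius II (E21)", "property": "P14",
     "event": "Painting of the Sistine Chapel (E12)", "role": "Sponsor"},
]

def actors_with_role(event, links=EVENT_LINKS):
    """Return (actor, role) pairs for all links pointing at the given event."""
    return {(link["actor"], link["role"]) for link in links
            if link["event"] == event}
```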
Class: [select Person/Group/Legal Body]
Name: [identifier = text]
Alternative Name(s): [1-n: P131 – E82] (visible after + button (next to name) is pressed)
Description: [text]
Begin: [1-1 Place, date, period; OA9 – E53; OA1 – E61; OA2 – E4]
End: [1-1 Place, date, period; OA10 – E53; OA3 – E61; OA4 – E4]
Residence: [1-n P74 – E53]
Connections to Events [1-n E39 + role + E5]
Affiliations: [1-n P107 – E39 + role E55]
Direct Relations to Persons: [1-n OA7 – E39 + role E55]
Indirect Relations to Actors: via Events
Indirect Relations to Places: via Events
Indirect Relations to Physical things: via Events
Direct Relations to Physical things: Property: [1-n P51i – E18]
Bibliography [1-n P67 - E31]
Primary Sources [1-n] P67 - E33
Images [P67 - E31]
External references [tbd]
Files [tbd]
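Taken together, the actor form above could be sketched as a flat record (a hypothetical dataclass for illustration; the real model stores each field as a CIDOC link, not a column):

```python
from dataclasses import dataclass, field

@dataclass
class ActorForm:
    """Illustrative flat view of the actor form fields listed above."""
    actor_class: str                 # Person / Group / Legal Body
    name: str                        # identifier
    alternative_names: list = field(default_factory=list)  # P131 -> E82
    description: str = ""
    residences: list = field(default_factory=list)         # P74 -> E53
    affiliations: list = field(default_factory=list)       # P107 -> E39 + role
    bibliography: list = field(default_factory=list)       # P67 -> E31
```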
Physical Things (non moveable as well as moveable) are mapped as a combination of an E18 (non moveable) or E19 (moveable = Finds) entity with an E53 Place entity via a P53 (has former or current location) link.
Type: [E18/E19 – p2 has type – E55] "Site", "Feature", "Stratigraphical Unit", "Find" or subtype; the class (E18 or E19) is determined automatically depending on the GUI
Name: [identifier = text]
Alternative Name(s): [1-n: E18/E19 – P131 – E82] (visible after + button (next to name) is pressed)
Description: [text]
Begin: [E18/E19 – OA1 – E61] with from-to values + suffix if necessary
If this date is not known exactly, two time primitives can be recorded to mark a temporal span in which the beginning took place. The first timestamp is therefore connected (p2 has type) with a type (E55) "from value", the second with a "to value" type (= subtypes of "Numeric Value Types"). If one exact date is known, it gets the type "exact value".
End: [E18/E19 – OA2 – E61] with from-to values + suffix if necessary
If this date is not known exactly, two time primitives can be recorded to mark a temporal span in which the end took place. The first timestamp is therefore connected (p2 has type) with a type (E55) "from value", the second with a "to value" type (= subtypes of "Numeric Value Types"). If one exact date is known, it gets the type "exact value".
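The from/to encoding of fuzzy dates described above could be sketched like this (an assumed helper, not OpenAtlas code):

```python
# Encode a begin or end date as typed time primitives: a known exact date
# yields one "exact value" timestamp, a fuzzy date yields a "from value"
# and a "to value" timestamp spanning the possible range.
def encode_date(earliest, latest=None):
    if latest is None or latest == earliest:
        return [{"timestamp": earliest, "type": "exact value"}]
    return [
        {"timestamp": earliest, "type": "from value"},
        {"timestamp": latest, "type": "to value"},
    ]
```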
Image(s) [E18/19 - P67 - E31 + E55 "image"]
Easting: [number]
Northing: [number]
both values are stored as WGS84 coordinates in decimal degree format.
select from map: alternatively the user can select a point on the map and the coordinates will be retrieved from the geographical position of the mouse click. The cursor should turn into crosshairs when this option is active and the mouse moves into the map window.
Map: leaflet map that shows the current physical things with the same supertype (e.g. Site) with some basic navigation and search functionality.
Question: Where to store the coordinates, i.e. where to create the PostGIS Point geometry?
Needed columns:
Easting (double)
Northing (double)
EPSG/SRID (integer)
Geom (Point) ###important: PostGIS Point format, not Postgres Point format###
Create a separate table? Add columns to the entity table?
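One possible answer to the open question, sketched as a separate table (all names are illustrative assumptions; the DDL presumes PostGIS is installed):

```python
# DDL sketch for a separate coordinate table holding the needed columns,
# with geom as a PostGIS Point (not the plain Postgres point type).
CREATE_GIS_POINTS = """
CREATE TABLE gis_points (
    id        serial PRIMARY KEY,
    entity_id integer REFERENCES entity (id),
    easting   double precision,
    northing  double precision,
    srid      integer DEFAULT 4326,
    geom      geometry(Point, 4326)
);
"""

def insert_point_sql(entity_id, easting, northing, srid=4326):
    """Build an INSERT that derives the PostGIS geometry from the raw values."""
    return (
        "INSERT INTO gis_points (entity_id, easting, northing, srid, geom) "
        f"VALUES ({entity_id}, {easting}, {northing}, {srid}, "
        f"ST_SetSRID(ST_MakePoint({easting}, {northing}), {srid}));"
    )
```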
Quality [E53 - p2 has type - E55, subtype of "Localisation Quality"]: this option documents the spatial fuzziness, i.e. the quality of the localisation, e.g. whether the point is known exactly, located at an unknown position within a larger area, etc.
Is situated within [E53 1-n p89 - (E53 + E5)] + record specifications like parcel number etc.
Evidence [E18/E19 - p2 has type - E55 type of "Evidence" Subtype]
Periods: tbd
further Relations:
Bibliography [1-n E18/E19 - P67 - E31+E55 "Text"]
Primary Sources [1-n E18/E19 - P67 - E33+E55 "Primary Source Content"]
Events that occur here or influence this place [tbd]
Actors linked to this place [tbd]
Files [tbd]
External references [tbd]
Features/Subunits [tbd]
Shapes [tbd]
Dimensions [tbd]
Participants: Bernhard Koschicek, Alexander Watzinger, Stefan Eichert, Christoph Hoffmann, Nina Brundke, Roland Filzwieser, Jennifer Portschy
Location: Museum of Natural History, Vienna
To create an issue you'll have to be a registered user. You can create an account at https://redmine.openatlas.eu/account/register. It can take from a few hours up to a few days until the account is activated.
The important ones are subject and description, when in doubt leave the other fields empty.
A short description e.g. "Search function for people"
Describe your request or problem and provide a URL. A lot of information, e.g. which version you use, is available from just a URL.
If it is a bug the more information the better. E.g. if a button isn't working as expected providing the following would be very helpful:
For text formatting you can refer to the Redmine documentation
This field is used to describe the type of issue.
Bug | If you found an error. |
Feature | If you want a new feature e.g. you would like a search function for a list view. |
Question | If you want to ask something. |
Administration | Administrative tasks e.g. a server upgrade or planning an event |
Leave on new if it's not yours to work on.
New | It's a new issue. |
Acknowledged | It has been seen and accepted. |
Assigned | Someone is assigned to work on it. |
In progress | Someone began working on it. |
Resolved | It's dealt with but not closed, e.g. performance changes but it still has to be tested. |
Closed | The issue has been successfully dealt with. In case the issue isn't solved it will be reopened. |
Rejected | The issue was rejected with an explanation, e.g. it is not possible for technical reasons. |
Duplicate | The issue was already reported. |
This field is not in use, since the priority of issues is determined through their placement in the roadmap. However, because this field can't be removed in Redmine it is still available and "normal" is the only available option.
Choose yourself if you want to work on it, leave it empty otherwise.
Leave empty. It's for internal project planning.
This field is only available for bugs and used to track the version where the bug was found.
Which features are planned for future releases can be seen on the Roadmap.
Please note that development on the OpenAtlas project is a fluid process and milestones will change depending on factors such as:
Issue #1542
pipenv install <package==version>
pipenv update
The folders openatlas/uploads, openatlas/export and openatlas/image_processing contain files which users uploaded or generated. Therefore, additional volumes have to be created with the mount point /app/openatlas/<folder> (e.g. /app/openatlas/uploads) and read-only set to false.
Backups are made every day at 03:20 to a separate volume, which can be mounted into the container. At the moment only the last 10 backups are kept.
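As a sketch, the extra volumes could look like this in a docker-compose file (the service name, volume names and the backup mount point are assumptions, not the project's actual configuration):

```yaml
services:
  openatlas:
    volumes:
      - uploads:/app/openatlas/uploads             # user uploads, read-write
      - export:/app/openatlas/export               # generated exports
      - image_processing:/app/openatlas/image_processing
      - backups:/backup                            # daily 03:20 backups land here
volumes:
  uploads:
  export:
  image_processing:
  backups:
```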
Not up to date!
Contains the mappings for linked ontologies. We use the Linked Places mappings found in http://linkedpasts.org/assets/linkedplaces-context-v1.jsonld
e.g. "@context": "https://raw.githubusercontent.com/LinkedPasts/linked-places/master/linkedplaces-context-v1.jsonld"
The type of each output is "FeatureCollection", according to GeoJSON-LD, containing one or more Feature objects.
features is a list which can hold one or many feature objects. Each object can contain the following elements:
This is a unique and permanent URI pointing to the entity in the OpenAtlas instance.
"@id": "http://thanados.openatlas.eu/entity/50505"
Show CIDOC CRM class code and name based on the Erlangen CRM
"crmClass": "crm:E18_Physical_Thing"
The value for type is "Feature" to state that the object is a feature.
The properties element holds one key:value pair because of the GeoJSON requirement. The key is title, which is the preferred name label of the entity; in the OpenAtlas form it corresponds to the Name field.
"properties":{
"title":"Gars Thunau Obere Holzwiese"
}
when describes the temporal scope of the represented feature. OpenAtlas only supports dates in the YYYY-MM-DD format, so no periods or extensions for approximate and uncertain dates (for more details: Date).
Each when contains one timespans[] list. A timespan can contain start{} and/or end{}; start{} and end{} hold timestamp values and the comments for the dates.
"when":{
"timespans":[
{
"end":{
"earliest":"0950-01-01",
"latest":"0950-12-31"
},
"start":{
"earliest":"0750-01-01",
"latest":"0750-12-31"
}
}
]
}
names is a list of Aliases.
"names":[
{
"alias":"Thunau Obere Holzwiese"
}
]
Represents a list of one or more type objects (nodes) for this feature. identifier refers to the type entity and label is the given name of the type. hierarchy lists the super types in a string for the THANADOS front end. If the type is a value type, the unit and value are also shown.
"types":[
{
"hierarchy":"Evidence > Archaeology",
"identifier":"http://thanados.openatlas.eu/api/0.1/entity/5099",
"label":"Excavation"
},
{
"hierarchy":"Dimensions",
"identifier":"http://thanados.openatlas.eu/api/0.1/entity/26189",
"label":"Length",
"unit":"cm",
"value":"200"
}
]
A GeoJSON FeatureCollection must have one or more geometry elements to be valid. Therefore each entity, even if it has no geometric references, has this entry:
"geometry":{
"geometries":[ ],
"type":"GeometryCollection"
}
type refers to geometric types according to RFC 7946 - The GeoJSON Format.
geometries[] is a list of zero or more geometric objects. Each of these objects has the following keys:
"geometry":{
"geometries":[
{
"coordinates":[
21.53314304247,
41.528480700127
],
"title":"Ōrěhovь Dolь",
"type":"Point"
},
{
"coordinates":[
[
[
21.519985198975,
41.510119875951
],
[
21.547966003418,
41.509991329146
],
[
21.560153961182,
41.527471351458
],
[
21.546764373779,
41.546745312346
],
[
21.519470214844,
41.546231414598
],
[
21.506080627441,
41.528113909357
],
[
21.519985198975,
41.510119875951
]
]
],
"description":"A model of the village's boundaries.",
"title":"Ōrěhovь Dolь",
"type":"Polygon"
}
],
"type":"GeometryCollection"
}
These are the links to other gazetteers. At the moment OpenAtlas supports GeoNames, as described in Place.
"links":[
{
"identifier":"http://www.geonames.org/2763660",
"type":"closeMatch"
}
]
Is a list of one or more entities which are linked through the CRM to this feature. label represents the name of the entity, relationTo contains the direct link to this entity and relationType shows the property code from the model with which the entities are linked.
"relations":[
{
"label":"Cenotaph_A",
"relationTo":"http://oa-dev.koschigel.de/api/0.1/entity/57167",
"relationType":"crm:P46_is_composed_of"
},
{
"label":"Excavation",
"relationTo":"http://oa-dev.koschigel.de/api/0.1/entity/5099",
"relationType":"crm:P2_has_type"
}
]
Contains the description of the entity.
"description":[
{
"value":"In the area of Obere Holzwiese 215 inhumation burials were documented in different excavations[..]"
}
]
Is a list of all linked depictions or files. @id links directly to the file, title represents the given name and license contains the given license of the file.
"depictions":[
{
"@id":"http://oa-dev.koschigel.de/api/0.1/entity/117992",
"license":"CC BY-SA 4.0",
"title":"oberleiserberg_map"
}
]
{
"@context":"https://raw.githubusercontent.com/LinkedPasts/linked-places/master/linkedplaces-context-v1.1.jsonld",
"type":"FeatureCollection",
"features":[
{
"@id":"http://127.0.0.1:5000/entity/50505",
"type":"Feature",
"crmClass":"crm:E18 Physical Thing",
"systemClass":"place",
"properties":{
"title":"Thunau Obere Holzwiese"
},
"description":[
{
"value":"In the area of Obere Holzwiese 215 inhumation burials were documented in different excavations. The cemetery ranges from the (later) 8th c. to the (early) 10th c.\r\n##German\r\n215 Bestattungen wurden im Bereich der Oberen Holzwiese dokumentiert; im NW-Areal wird eine Holzkirche vermutet, die wohl bereits in der ersten Hälfte des 9. Jahrhunderts vorhanden war."
}
],
"when":{
"timespans":[
{
"start":{
"earliest":"0750-01-01",
"latest":"0750-12-31"
},
"end":{
"earliest":"0950-01-01",
"latest":"0950-12-31"
}
}
]
},
"types":[
{
"identifier":"http://127.0.0.1:5000/api/0.2/entity/22378",
"label":"Inhumation Cemetery",
"description":null,
"hierarchy":"Place > Burial Site > Cemetery",
"value":null,
"unit":null
},
{
"identifier":"http://127.0.0.1:5000/api/0.2/entity/5099",
"label":"Excavation",
"description":null,
"hierarchy":"Evidence > Archaeology",
"value":null,
"unit":null
}
],
"relations":[
{
"label":"Cenotaph_A",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/57167",
"relationType":"crm:P46 is composed of",
"relationSystemClass":"feature",
"relationDescription":null,
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"0750-01-01",
"latest":"0750-01-01"
},
"end":{
"earliest":"0950-12-31",
"latest":"0950-12-31"
}
}
]
}
},
{
"label":"Excavation",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/5099",
"relationType":"crm:P2 has type",
"relationSystemClass":"type",
"relationDescription":null,
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"Inhumation Cemetery",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/22378",
"relationType":"crm:P2 has type",
"relationSystemClass":"type",
"relationDescription":null,
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"Location of Thunau Obere Holzwiese",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/50510",
"relationType":"crm:P53 has former or current location",
"relationSystemClass":"object_location",
"relationDescription":null,
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"Obere Holzwiese",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/179483",
"relationType":"crm:P1 is identified by",
"relationSystemClass":"appellation",
"relationDescription":null,
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"GeoNames",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/155980",
"relationType":"crm:P67i is referred to by",
"relationSystemClass":"reference_system",
"relationDescription":"2763660",
"type":"closeMatch",
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"http://austriaca.at/8066-1inhalt?frames=yes",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/116288",
"relationType":"crm:P67i is referred to by",
"relationSystemClass":"external_reference",
"relationDescription":"Website EPub",
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"https://doi.org/10.2307/j.ctv8xnfjn",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/116289",
"relationType":"crm:P67i is referred to by",
"relationSystemClass":"external_reference",
"relationDescription":"Nowotny 2018 EPub",
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
},
{
"label":"thunau_modified",
"relationTo":"http://127.0.0.1:5000/api/0.2/entity/112760",
"relationType":"crm:P67i is referred to by",
"relationSystemClass":"file",
"relationDescription":null,
"type":null,
"when":{
"timespans":[
{
"start":{
"earliest":"None",
"latest":"None"
},
"end":{
"earliest":"None",
"latest":"None"
}
}
]
}
}
],
"names":[
{
"alias":"Obere Holzwiese"
}
],
"links":[
{
"type":"closeMatch",
"identifier":"https://www.geonames.org/2763660",
"referenceSystem":"GeoNames"
}
],
"geometry":{
"type":"Point",
"coordinates":[
15.6432715260176,
48.5867361212989
],
"title":"",
"description":""
},
"depictions":[
{
"@id":"http://127.0.0.1:5000/api/0.2/entity/112760",
"title":"thunau_modified",
"license":"Bildzitat",
"url":"N/A"
}
]
}
]
}
Whitepaper Geometries
I. Physical things like buildings, settlements, regions, areas etc. that have or originally had a position in space and a certain extent
II. Roads/routes/rivers that have or originally had a position in space and a certain extent
III. (Find-)spots with no spatial extent that only have point coordinates.
case 1: The extent is known and can be drawn as a polygon that represents the extent (= shape) of the physical thing
E.g. the shape of a building, the area of an excavation or the area of a settlement that can be drawn for example from an aerial photograph or a map.
case 2: The extent is not known but is known to be within a larger area with known extent that can be drawn as a polygon.
E.g. a no longer existing settlement that is known to have been situated within a known area, for example in a valley between two other known settlements.
case 3: The extent is not known but is known to be within a larger predefined area with known extent that is already in the database.
E.g. an archaeological findspot of unknown position that is known to have been situated inside the boundaries of a certain administrative unit.
case 4: The extent is not known but is known to be within a larger area with unknown extent that cannot be drawn as a polygon.
E.g. a no longer existing settlement that is known to have been situated within the historical boundaries of a no longer existing county.
case 5: There is no extent but only a known centerpoint
E.g. the coordinates where one find has been found.
case 6: Neither the extent nor a vague position within a reasonable larger area is known.
In many cases the exact identification of physical things/places mentioned in sources (e.g. in charters or in archaeological publications) with one certain and still existing physical entity and its extent is not possible. Therefore it is necessary to allow multiple possibilities to record possible locations of physical things:
A charter, for example, may mention one church, and today two still existing churches might be identified with the one mentioned. In this case the church entity from the source should be linked to two possible spatial objects; here they would be two polygons representing the extent of the respective church. However, any combination of the above mentioned cases must be technically and conceptually possible in any number. In theory it must be possible to link e.g. a castle known from a charter to the very extent of a still existing castle and at the same time to a vanished castle that is known to have been located within a certain area, and also to another possible location within a certain administrative unit, etc.
Each entity with a spatial position will be represented on the map at least by a marker (point). If polygon data is available, these polygons will be shown too.
We want to offer the possibility to define the spatial position in any combination of the aforementioned categories. Therefore the location of the physical entity is connected 1-n to one or more entries in the GIS tables.
In the map interface:
1.)
The user should be able to draw polygons to define either the extent of the physical entity or an area in which the physical thing is situated.
Also, predefined categories should be available to define whether it is an extent or an area.
These polygons should be editable and deletable.
Methods: leaflet draw polygon and postgis
Form with:
Dropdown for category selection (shape or area)
Text field for description
2.)
The user should be able to set markers to define a point of location. In this case a point is drawn and saved to gis_centerpoints. No polygon is drawn.
Postgis to/from Leaflet: see Stefan's drawshapes.js
1. Get GeoJSON from existing polygons and show them on the map.
2. Show centerpoints on the map.
3. Make polygons from vertices drawn in Leaflet and save new and edited ones to the database.
4. Delete polygons from Leaflet and the database.
Triggers/Warnings:
5. Create/update centerpoint data automatically after a polygon is drawn or edited, using PostGIS ST_PointOnSurface:
SELECT ST_AsText(ST_PointOnSurface(geom)) FROM openatlas.polygon WHERE id = <value of id>;
Insert or replace the result into gis_centerpoints.
6. Warn if point data is updated and moved outside of the polygon, using PostGIS ST_Intersects:
SELECT ST_Intersects(ST_GeomFromText('POINT(<new x> <new y>)', 4326), (SELECT geom FROM openatlas.polygon WHERE id = <value of id>));
If true, all is good; if false, alert and do not allow drawing the new centerpoint.
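For illustration, the same containment test can be done client-side with a simple ray-casting check (a hypothetical helper, not part of OpenAtlas or Leaflet):

```python
# Ray-casting point-in-polygon test: count how often a horizontal ray from
# the point crosses the polygon boundary; an odd count means "inside".
def point_in_polygon(x, y, ring):
    """ring: list of (x, y) vertices of a closed polygon (implicitly closed)."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

If the moved centerpoint fails this check, the client could warn before the server-side ST_Intersects query rejects the update.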
Datainput:
offer workflow
1.
select type of localisation (see 2.)
2.
if 1 - start polygon draw-tool in map and open form (type is automatically "shape")
if 2 - start polygon draw-tool in map and open form (type can be selected from predefined categories)
if 3 - offer selection tool for administrative units
if 4 - offer selection tool for historical regions
if 5 - start markerdraw (as it is by now)
3.
after 1-3: save to db and trigger the creation of a centerpoint
after 4: if historical region has a geometry trigger the creation of a centerpoint else don't
after 5: nothing necessary
Map: use existing map Interface but remove buttons.
after one localisation is done: offer the possibility to add one more (1-n)
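The case handling above could be sketched as a small dispatch table (tool labels and the helper are illustrative names only):

```python
# Illustrative mapping of localisation type (step 2) to the tool that opens,
# plus the centerpoint rule from step 3 of the workflow above.
LOCALISATION_TOOLS = {
    1: "polygon_draw_shape",     # type is automatically "shape"
    2: "polygon_draw_category",  # category chosen from predefined list
    3: "select_admin_unit",      # predefined administrative units
    4: "select_historical_region",
    5: "marker_draw",            # centerpoint only
}

def needs_centerpoint(case, region_has_geometry=False):
    """Cases 1-3 always trigger centerpoint creation, case 4 only if the
    historical region has a geometry, case 5 already is a point."""
    if case in (1, 2, 3):
        return True
    if case == 4:
        return region_has_geometry
    return False
```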
Participants: Stefan Eichert, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Location: ACDH meeting room, Vienna
Participants: Stefan Eichert, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann and special guest Rainer Simon
Location: ACDH meeting room, Vienna
Starting with JSON resp. GeoJSON representation of OpenAtlas data using Linked places format https://github.com/LinkedPasts/linked-places
Linked Places uses JSON-LD syntax and can therefore be used as valid RDF too
Using GeoJSON-T https://github.com/kgeographer/geojson-t it can be extended by temporal attributes
"relations" in the syntax can be used to represent CIDOC-CRM mappings
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Location: ACDH meeting room, Vienna
--> We will ask Rainer Simon about possibility to add other entities like actors or sources to the linked places format
Participants: Christoph Hoffmann, Jan Belik, Stefan Eichert, Asil Çetin, Alexander Watzinger
Location: ACDH meeting room, Vienna
This will be a hands-on meeting where we implement changes on the fly. The main aim will be to get a releasable Bootstrap version, but of course we can also collect ideas for future improvements.
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann, Roland Filzwieser
Location: CRAWS Headquarters, Vienna
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Location: ACDH-CH, Alte Burse, Vienna
With OpenAtlas we have developed quite impressive software. Good team dynamics, pleased project partners, rising cooperation requests and positive general feedback show that we have already accomplished a lot and are on a good path. It is about much more than just acquiring cooperations and implementing their requirements: we want to develop professional open source software which supports scientific projects and is pleasant to use. The purpose of this meeting is to discuss a concept for further development which is good for us and our users. To explore this we will talk about:
+ already achieved (but of course it's an ongoing effort)
~ there is room for improvement
- missing
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Planning of a general Frontend and API roadmap and a CONNEC specific Frontend in the light of Leeds presence in July 2021.
We plan next workshop at end of March to check our progress and adapt the roadmap if needed.
Call them administrative units
Merge find, artifact and information carrier into artifact, which is also preparation for the next CIDOC version. They then also share the same standard type.
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Berni made a cool double chocolate cake to celebrate 6.0.0 release.
Discuss newest cooperation request
We like it, and it may be an interesting opportunity to add bibliographical information functionality
Internships: https://www.oeaw.ac.at/acdh/education/acdh-ch-internships
We like the idea and may try it in July or August (Alex & Nina)
Image carousel for screenshots at project website
https://getbootstrap.com/docs/4.6/components/carousel/
Layout for #1457: Public notes and #1443: List view for entities missing a specific type
Done, descriptions updated
Allow characters in (value) types
Declined because it's better to keep data structured
Discuss fancy visualizations using d3.js and others
Done by Stefan and Nina
OpenAtlas Discovery
Christoph showed the new presentation site prototype (OpenAtlas Discovery) and we discussed details.
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschicek, Christoph Hoffmann, Andreas Olschnögger
Participants: Alexander Watzinger, Bernhard Koschiček-Krombholz, Dalibor Pančić
Kickoff meeting for testing and adapting OpenAtlas to be deployed with Kubernetes too (#1542).
Participants: Stefan Eichert, Nina Brundke, Alexander Watzinger, Bernhard Koschiček-Krombholz, Christoph Hoffmann, Andreas Olschnögger
Participants: Nina Brundke, Alexander Watzinger, Andreas Olschnögger, Aleksandra Apic (SHAHI)
Updated information is in color.
Andi also suggested that images could be uploaded directly to an IIIF server and the presentation site would make the connections. Although this may work out for the presentation site, we should discuss it further. It may be better to upload them to OpenAtlas the usual way, to be able to add meta information and link them in the application. It would also prevent issues with archiving, cases where images are linked to multiple entities, and so on.
Andi and I decided it would be easier if images are uploaded via the OpenAtlas application. This would have some advantages, like the possibility to add meta information such as descriptions and dates, or to use one image for multiple artifacts.
Updated information in the course of the meeting is in color.
Discussing issues on the roadmap and what information will be available before OpenAtlas (e.g. time, geolocation of images) and how to import them.
Maybe most users should just have the contributor role to prevent changing of vocabularies.
Where would images be saved and what information will be already included.
Updated information in the course of the meeting is in color.
Alexander Watzinger, Bernhard Koschiček-Krombholz, Nicholas Melvani (Constantinople project)
Updated information in the course of the meeting is in color.
Detailed sites of case studies could look like: http://www.scrinium.umk.pl. More examples would be useful (screenshots, links, ... to similar projects)
We will send suggestions for this in the next mail.
Updated information in the course of the meeting is in color.
Updated information in the course of the meeting is in color. Every participant is invited to add and adapt.
The meeting is about showing how to enter data into OpenAtlas, demonstrating new features and discussing further adaptions and workflows.
Updated information in the course of the meeting is in color. Every participant is welcome to add and adapt.
The meeting will be about workflows when sharing the same database, introducing new features and discussing questions.
Updated information in the course of the meeting is in color. Every participant is welcome to add and adapt.
This meeting is about the presentation site for CONNEC data: https://frontend-connec.openatlas.eu/
Updated information in the course of the meeting is in color. Every participant is welcome to add and adapt.
The meeting will be about workflows when sharing the same database, introducing new features and discussing questions.
This meeting is about the workflow between OpenAtlas and ARCHE for the INDIGO project.
This meeting was about CIDOC CRM, SKOS and OpenAtlas. Other issues that were discussed:
This meeting is about the map module in the OpenAtlas user interface
This meeting is about importing INDIGO thesaurus from ACDH-CH Vocabs into OpenAtlas (#1663)
This meeting is about showing how to enter archeological data into the Shahi OpenAtlas instance for new colleagues.
For information like inventory or accession numbers, which is currently noted in the description field, we will add a new reference system Inventory number, which is better suited to track this kind of information.
The inventory numbers relate to different institutions like museums. Because there are too many institutions to create a reference system for each, only one (Inventory number) will be used.
For display at the presentation site the inventory number could be shown in combination with the owner (e.g. a museum)
New entries will be created using the reference system Inventory number. Former notes in the description will be transformed over time by the Shahi team.
Because this feature (#1587) isn't implemented yet (but is already on the roadmap), we created the custom type Location to track this information for now.
Once the feature is implemented we will use a script to transform information tracked with this type to artifact -> place relations.
This meeting is about importing the thesaurus from ACDH-CH Vocabs into OpenAtlas (#1663). We invited our Vocabs expert Klaus to help clear up open questions.
Massimiliano will take a look at the vocabseditor but most likely we keep the current workflow (INDIGO providing the data and Massimiliano importing it to Vocabs).
At the moment, identifiers like 0080 are used in the INDIGO project; are there better alternatives?
Klaus suggested using a camel case notation of the label in the URI when stable, e.g. https://vocabs.dariah.eu/tadirah/en/page/audioAnnotation.
Versioning is possible in theory, but currently there are no good practical solutions that could be used as examples. E.g. former versions could be archived in ARCHE while only the newest version is available on Vocabs. The problem with this solution is that anyone who still wants to use an older version would have to update their systems to link to the archived instance in ARCHE.
Additional suggestion from Klaus: don't use version numbers in URLs. Older versions should be dumped with a changelog, and the new entries linked to the old entries with close match.
Although collections/groups are good for a nice representation, they could be problematic for data entry/usage because they are not concepts, just containers. E.g. we have hierarchical groups, like Graffito Component, but it shouldn't be used for linking in OpenAtlas. Klaus answered an interposed question from Alex about broader/broadMatch: broadMatch is used for external references, broader for internal ones.
There are multiple approaches to solve this, e.g. dropping the 2nd level (Concept Idea) in OpenAtlas and displaying all 3rd level entries as a flat list. Another option would be to use concepts as 2nd level entries.
This meeting will be about the presentation site for CONNEC, a prototype is available here: https://discover-connec.openatlas.eu/
Introduction of Andi
Some issues regarding the current design of the thesaurus were discussed, based on a previous meeting. More specifically, three topics were discussed and kindly documented by Massimiliano.
It is usual practice to structure SKOS vocabularies according to a hierarchy built with skos:broader and skos:narrower relationships, which are established between different concepts (i.e., instances of class skos:Concept).
The INDIGO thesaurus, instead, uses collections (i.e., instances of class skos:Collection) to represent its structure. Collections are also called “groups” in the interface of the Vocabs service. A collection is like a container that brings together different concepts that are related to each other for certain characteristics; a concept (or even a collection) can only be a member (property skos:member) of a certain collection (NOT skos:narrower! This property can only be used between concepts).
Therefore, the INDIGO thesaurus currently has a very flat hierarchy, where every concept is at the top level of the thesaurus (with the only exception of “First Line” and “Second Outline”, which are narrower than “Outline”).
The structure of the thesaurus is represented instead by means of collections. There are even collections that are members of other collections.
While this is formally and technically possible according to the SKOS specification, this kind of structure might render the thesaurus less understandable and usable for other services and users. For example, it is already creating issues with the import into OpenAtlas. Furthermore, it is difficult to manage such a vocabulary with the existing input tools, as also highlighted by Klaus on Mattermost. Therefore, we should consider if we want to restructure the vocabulary in order to create a SKOS hierarchy based on narrower-broader relationships. This would require rethinking the whole structure not only from a technical, but also (and foremost) from a semantic point of view.
Two important aspects to consider:
All participants agree that it is best to use names of concepts and collections in identifiers instead of numbers.
For example, for concept “Commissioned Work”, we would change the identifier as follows:
(old) https://vocabs.acdh.oeaw.ac.at/indigo/0018
(new) https://vocabs.acdh.oeaw.ac.at/indigo/commissionedWork
provided that we are using the camelCase writing practice.
Jona asks if it is possible to compare different versions of the thesaurus, like in the diff view on GitHub. Massimiliano answers that, unfortunately, this is not possible in either the Vocabs service or ARCHE. The only way to track changes is to update the version number according to semantic versioning rules and keep a log in the metadata to the thesaurus (for example, by using the Dublin Core “description” property).
However, we could still think of storing the different thesaurus releases on GitHub, and automatically archive them into ARCHE (a similar workflow will be soon implemented for the ARCHE ontology itself). This way, it would be possible to compare different releases directly in GitHub, for example by following this method: https://docs.github.com/en/repositories/releasing-projects-on-github/comparing-releases
Asked by Jona and answered by Klaus.
Question
Would at be at least possible to track the changes like going back in the history/ seeing the implemented changes? I am thinking about something similar to the diff view in GitHub.
Answer
If we talk about versioning on the side of Skosmos (the presentation layer on vocabs.acdh-dev.oeaw.ac.at), then the answer is simple: there is no versioning. We introduced a custom versioning workflow at ACDH-CH by adding the dumps of the vocabularies into the ARCHE repository. Visualizing differences is not implemented there; custom scripts may introduce this. Generally, we agree on a stable version of a vocabulary which is published on the public server. This version should not be in development (meaning: we avoid single changes on the public server). New versions on the public server should raise a version number and lead to the upload of a new dump to ARCHE. There is no going back in history on the side of Skosmos, but there is the possibility to see changes between versions, if the timestamps are correctly set (these timestamps need to be set individually per changed concept in the input tool).
Question
So, from a thesaurus/vocabs stand point, what would then be the best-practice in terms of collections/ group? Is it not considered helpful in structuring the thesaurus? Are there maybe good examples of the usage of collections and thesauri linked to databases and their structure?
How we proceed: Why is the "Concept Idea" not OK? Is it because it is too similar to the term 'concept' as used in SKOS? Or are you just mentioning it because we could not use it in OpenAtlas, as it would be a collection/group there as well? Because we actually do not want to use it as a type/attribute to describe a graffito. We are aware that we can (and also want to) only use concepts in OpenAtlas.
Answer
Formally, the use of collections/groups is correct and there are some examples where it is used parallel to a hierarchy. In terms of practicability it depends strongly on the input tool that you use. The reason why most vocabularies only use the hierarchical view is that input fields of input tools are usually only capable of handling the hierarchical view. Collections are more complicated, as they have terms that are not defined concepts and may additionally introduce complexity which select/combo boxes can't handle very well. For my projects I can only state that I tell them that the input form is created based on the hierarchy and that the collections are ignored by the input tool. Collections are just nice for the vocabulary presentation layer on Skosmos and may give some helpful background information for people who browse such a vocabulary.
This meeting will be about the presentation site for CONNEC, a prototype is available here: https://discover-connec.openatlas.eu/
Discussing features on the Roadmap -> discussed features were documented as notes in the features themselves.
Check if solved.
The meeting will be about an introduction for entering data into OpenAtlas.
After users made themselves familiar with data entry and collected some questions and topics to discuss we can schedule a -> new meeting Meeting 2022 06 29
The meeting will be about OpenAtlas and the project: "Approaching Byzantium in Ottoman Istanbul: the Reception of the Byzantine Heritage of Constantinople by Scholars from the Holy Roman Empire in the 16th century."
Topics discussed:
This will be a shorter and more informal meeting before the summer and vacation time. Nevertheless we have created this protocol in case we want to document discussed topics.
This is the second part of the introduction for entering data into OpenAtlas for MAMEMS.
We were able to discuss and either solve or plan all relevant issues. Nicholas will notify us when another meeting is needed.
The meeting will be about the (re)start of the general presentation site OpenAtlas Discovery, which makes data entered into OpenAtlas accessible to a greater audience. The focus will be on the technologies used.
Location: Bäckerstraße 13, Room 2051 (smaller meeting room at the start of the hallway) is reserved for us from 13.00 to 15.00
The work and data flow for INDIGO is quite different. Because of the large amount of media data, it will be stored directly in ARCHE. OpenAtlas will then be used to add and structure data. In the end, the additional metadata should of course be stored in ARCHE again too. This meeting is about discussing and planning the communication between OpenAtlas and ARCHE.
Location: Bäckerstraße 13, Room 2036 (meeting room next to the kitchen) is reserved for us from 15.00 to 17.00
The meeting will be about data entry workflows for the MAMEMS and the Approaching Byzantium project.
Location: Bäckerstraße 13, Room 2036 (meeting room next to the kitchen) is reserved for us from 15.00 to 17.00
Location: Bäckerstraße 13, from 13.00 to 17.00, Room D2036 (2nd floor near kitchen)
After Andi did a great job redesigning our OpenAtlas webpage, we have a workshop to discuss details and implement changes in a live coding session.
Besides getting a new webpage we can put online, it's also a good opportunity for a hands-on session to play with our standard technologies (Flask, Python, JavaScript) in a more accessible setting (e.g. no database or user management).
Demo: https://dev.openatlas.eu/
Code branch on GitHub: https://github.com/craws/OpenAtlas-Website/tree/feature_new_ui
New website is online (https://openatlas.eu), additional ideas were noted in #1992
Location: Bäckerstraße 13, from 15.00 to 17.00, Room 2036 (meeting room next to the kitchen)
The meeting will be about the presentation sites for the MAMEMS and the Approaching Byzantium project.
Location: Bäckerstraße 13, Room 2036 (meeting room next to the kitchen) is reserved for us from 15.00 to 17.00
Location: Bäckerstraße 13, Room 2036 (meeting room next to the kitchen) is reserved for us from 13:00 on
This meeting is about starting data entry for the FemCareVienna project at https://femcarevienna.openatlas.eu.
Location: Bäckerstraße 13, Room 2036 (meeting room next to the kitchen) is reserved for us from 15.00 to 17.00
In this meeting we will discuss questions from the MAMEMS team about entering data in OpenAtlas.
Name/ Alias
(Vanessa) Where should the Ottoman Turkish, the transliteration into Latin letters, and the English equivalent be written, since they are not aliases? Likewise, in which section should the patronyms be included, i.e., “بن, bin/ibn, son of”?
Is there a standardised list of spellings for peoples’ names?
These questions and similar ones, e.g. using British or American English, will have to be discussed within the project team.
Dates
(Vanessa) Should Islamic dates (ḥicrī) as well as Christian dates (milādī, rūmī) be entered? If so, which should be written first?
Place
(Vanessa) Is there a list of standardised toponyms, since more than one name is attributed to some places? For example, the Ottoman capital:
Ottoman Turkish: قسطنطينية, اسلابول, اسطنبول
Transliteration: Ḳosṭanṭiniyye, İslāmbūl, İsṭanbūl
Modern Turkish: İstanbul
English: Constantinople, Istanbul
Ottoman Turkish Alias: آستانه همايونĀsitāne-i Hümāyūn (Imperial Threshold), دارالخلافة Dārü’l-ḫilāfe (Abode of the Caliph), در سعادت Dersaʿādet (Abode of Felicity), etc.
These questions and similar ones, e.g. using British or American English, will have to be discussed within the project team.
The meeting will be about data entry for the MAMEMS and the Approaching Byzantium project.
Location: ACDH-CH, Bäckerstraße 13, room 2D (the big one next to the kitchen in 2nd floor)
This meeting will be about the presentation site to make the results of the Approaching Byzantium project available to a broader audience.
Information from previous discussion:
We like the idea because it conveys the focus of the project at a first glance and created an issue for it (#2165).
Location: ACDH-CH, Bäckerstraße 13, room 2D (the big one next to the kitchen in 2nd floor)
Location: Bäckerstraße 13, Room 3D is reserved for us from 11:00 on
The meeting will be about data entry for the MAMEMS and the Approaching Byzantium project.
The following questions should be discussed within the research team(s) because they are more about content than about technical aspects of OpenAtlas.
We discussed these at the MAMEMS meeting on 28 February and here is the feedback from Zachary:
Location: ACDH-CH, Bäckerstraße 13, room 3D (the big one next to the kitchen in 3rd floor)
Is there a need for a special frontend format or endpoints? Specifications?
There are some new questions by Nicholas about data entry.
The meeting will be about the presentation site for MAMEMS and showing how to adapt project specific content.
New OpenAtlas Discovery instance for MAMEMS was shown, the CMS was presented, tested and liked.
Location: ACDH-CH, Bäckerstraße 13, room 3D (the big one next to the kitchen in 3rd floor)
Location: Online
This meeting was about questions regarding entering data. A PDF from Lara with well prepared questions (in German), including screenshots, is attached.
This allows for creating different vocabularies for each of these classes, which of course can be shared between them where needed.
The meeting will be about the presentation site for Approaching Byzantium: https://approaching-byzantium.openatlas.eu/.
Location: ACDH-CH, Bäckerstraße 13, 2D (the big one next to the kitchen in 2nd floor)
Updated information in the course of the meeting is in color and/or marked with an ✅. Every participant is welcome to add and adapt.
Participants: Bernhard Koschicek, Alexander Watzinger, Stefan Eichert, Christoph Hoffmann, Nina Brundke
Location: ACDH-CH, Alte Burse, Vienna
Collection of topics for upcoming meetings. Feel free to add your own topics, ideas or questions.
A collection of ideas that haven't made it into a ticket of their own yet.
f'{_("welcome")} {username}'
Functionality
CIDOC CRM is used as basis for the underlying data model of OpenAtlas. Currently we are using CIDOC CRM v7.1.1 from http://www.cidoc-crm.org/versions-of-the-cidoc-crm
A script is used to parse the specification and import it into a PostgreSQL database (https://github.com/craws/OpenAtlas/tree/main/install)
OpenAtlas "knows" the specification and checks every link that is made with the application. There is also a function to check all existing links, which can be useful e.g. after an import of external data.
We don't import or use inverse properties (ending with i) since our links are directed anyway. Nevertheless their labels are imported for more convenient viewing of relations.
Properties with URLs as domain/range are ignored because our system has a foreign key on domain which must match an existing class.
Also, some properties are linked as subproperties of properties with the i suffix. Since we don't use inverse properties in the database (direction is determined through domain/range selection), they are linked to their counterpart without i.
There are some "special" properties we ignore, e.g. P3, you can look them up in the OpenAtlas CIDOC parser script where they are defined at the top: https://github.com/craws/OpenAtlas/blob/main/install/crm/cidoc_rtfs_parser.py
We don't import them for technical reasons, e.g. they are missing some definitions that "normal" properties have, and the import script would have trouble dealing with them. E.g. they have no defined range, but range is a foreign key in our database that can't be empty.
We try, as long as possible, to use only the official CIDOC CRM and not to introduce our own classes or use extensions, which we have managed so far.
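The link check described above can be sketched as a domain/range test against the class hierarchy. The dictionaries and function names below are hypothetical illustrations using a toy subset of CIDOC CRM classes; a real OpenAtlas instance reads this data from its PostgreSQL database.

```python
# Toy subset of the CIDOC CRM class hierarchy; hypothetical names,
# not the actual OpenAtlas implementation.
CLASS_SUPERS = {  # class -> direct superclasses
    'E21': ['E20'], 'E20': ['E19'], 'E19': ['E18'], 'E18': ['E1'], 'E1': [],
}
PROPERTIES = {  # property -> (domain, range)
    'P46': ('E18', 'E18'),  # P46 is composed of
}

def is_or_subclass_of(cls: str, target: str) -> bool:
    """True if cls equals target or transitively inherits from it."""
    if cls == target:
        return True
    return any(is_or_subclass_of(s, target) for s in CLASS_SUPERS.get(cls, []))

def link_valid(prop: str, domain_class: str, range_class: str) -> bool:
    """Check one link against the declared domain/range of its property."""
    domain, range_ = PROPERTIES[prop]
    return (is_or_subclass_of(domain_class, domain)
            and is_or_subclass_of(range_class, range_))

print(link_valid('P46', 'E18', 'E21'))  # True: E21 is a subclass of E18
print(link_valid('P46', 'E1', 'E18'))   # False: E1 is not a subclass of E18
```

A full check over all existing links would simply run `link_valid` for every row of the link table.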
But we are using some shortcuts in the software/database for performance and to keep the code base maintainable.
More details: CIDOC CRM shortcuts.
OpenAtlas uses the CIDOC CRM in the application, but because of contextual differences we needed a more fine-grained model for the user interface.
E.g. E33 Linguistic Object can be a source or a source translation, which have different forms in different contexts.
An overview of internal mapping and CIDOC CRM classes can be found here
Here you can see a simplified version of the model used in OpenAtlas which is based on classes and properties of the CIDOC CRM.
The meeting will be about the presentation site and the API used to retrieve the relevant information.
These are the archived notes for the OpenAtlas poster implemented by our graphic designer Jan (#2015).
There should be 3 to 5 sections about important core features. We first collect these (feel free to add) and weight, refine content, ... them later.
Newest strategy would be to use attributes, see below
The data model specifies the structure in which the information is stored within the database. The use of an ontology, for example, allows the data to be combined more easily with information from other projects and is consistent with the FAIR principles. The OpenAtlas Model is based on the international standard of CIDOC CRM, an ontology widely used within the field of humanities.
Places with a known location can be entered on an interactive map based on Leaflet, which features different view layers and allows for zooming, fullscreen mode, clustering, searching and much more. PostGIS is used for creating and manipulating spatial data. Therefore, it is possible to enter locations as needed as multiple points, lines, areas and shapes.
Persons and Groups with their biographical information can be connected with different properties and in hierarchical order, forming a complex network of actors. Actors can act through different kinds of events (e.g. Activity, Production, Acquisition, ...), forming their life story and how they are connected to other Actors as well as Places and Artifacts.
Not all endpoints support all parameters. Also, some endpoints have additional unique parameter options, which are described in their sections.
path/parameter | type_id | format | page | sort | column | limit | filter/search | first | last | show | count | download | lang | geometry | image_size |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
entity | x | x | x | ||||||||||||
code | x | x | x | x | x | x | x | x | x | x | x | x | |||
system_class | x | x | x | x | x | x | x | x | x | x | x | x | |||
entities_linked_to_entity | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities | x | x | x | x | x | x | x | x | x | x | x | x | |||
type_entities_all | x | x | x | x | x | x | x | x | x | x | x | x | |||
class | x | x | x | x | x | x | x | x | x | x | x | x | |||
latest | x | x | x | x | x | x | x | x | x | x | x | x | |||
query | x | x | x | x | x | x | x | x | x | x | x | x | |||
node_entities | x | x | |||||||||||||
node_entities_all | x | x | |||||||||||||
subunit | x | x | |||||||||||||
subunit_hierarchy | x | x | |||||||||||||
type_tree | x | x | |||||||||||||
node_overview | x | x | |||||||||||||
geometric_entities | x | x | x | ||||||||||||
content | x | x | |||||||||||||
classes | |||||||||||||||
system_class_count | |||||||||||||||
display | x |
<'asc', 'desc'>
?sort=<'asc','desc'>
If multiple sort parameters are used, the first valid sort input will be used.
It does not matter if the words are uppercase or lowercase (i.e. DeSc or aSC), but the query only takes asc or desc as valid input. If no valid input is provided, the result is ordered ascending.
<'id', 'class_code', 'name', 'description', 'created', 'modified', 'system_type', 'begin_from', 'begin_to', 'end_from', 'end_to'>
The column parameter declares which columns in the table are sorted with the sort parameter.
?column=<'id', 'class_code', 'name', 'description', 'created', 'modified', 'system_type', 'begin_from', 'begin_to', 'end_from', 'end_to'>
If multiple column parameters are used, a list is created in the order in which the parameters are given (i.e. ?column=name&column=description&column=id will order by name, description and id).
It does not matter if the words are uppercase or lowercase (i.e. Name, ID, DeScrIPtioN or Class_Code). If no valid input is provided, the results are ordered by name.
<number>
The limit parameter declares how many results will be returned.
?limit=<number>
If multiple limit parameters are used, the first valid limit input will be used. Limit only takes positive numbers.
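The sort, column and limit parameters can be combined into one request URL. A minimal sketch using Python's standard library; the endpoint path follows the {domain}/api/{api version}/{endpoint} schema described earlier, and whether this exact class name is available on the demo instance is an assumption.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Build a request URL combining the sort, column and limit parameters.
# The base URL is an assumption modeled on the demo instance.
base = 'https://demo.openatlas.eu/api/0.3/system_class/place'
params = {'sort': 'desc', 'column': 'name', 'limit': 20}
url = f'{base}?{urlencode(params)}'
print(url)

# Parsing the URL back shows the parameters round-trip as expected.
query = parse_qs(urlsplit(url).query)
print(query['sort'], query['limit'])  # ['desc'] ['20']
```

Using `urlencode` instead of hand-written strings avoids quoting mistakes once parameter values contain spaces or special characters.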
The search parameter provides a tool to filter and search the data with logical operators.
Search parameter:
?search={}
Logical operators:
These are not mandatory. or is the standard value.
and, or
Compare operators:
equal, notEqual, greaterThan*, greaterThanEqual*, lesserThan*, lesserThanEqual* (* works only in combination with beginFrom, beginTo, endFrom, endTo)
Filterable categories:
entityName, entityDescription, entityAliases, entityCidocClass, entitySystemClass, entityID, typeID, typeName, typeDescription, valueTypeID, valueTypeName, beginFrom, beginTo, endFrom, endTo
The search parameter takes a JSON as value. A key has to be a filterable category, followed by a list/array. This list in turn needs JSON values as items. Dates are always strings and need the YYYY-MM-DD format (0852-01-03). There can be multiple search parameters. E.g:
?search={"valueTypeID":[{"operator":"equal","values":[123456]}], "typeName":[{"operator":"notEqual","values":["Chain", "Bracelet"],"logicalOperator":"and"}], "endFrom":[{"operator":"greaterThanEqual","values":["0830-01-01"]}]}&search={"valueTypeName":[{"operator":"equal","values":["Gold"]}]}
Every JSON in a search parameter field is logically connected with AND. E.g:
?search={A:[{X}, {Y}], B: [M]} => Entities containing A(X and Y) and B(M)
Each search parameter is logically connected with OR. E.g:
?search={A:[{X}, {Y}]}&search={A:[{M}]} => Entities containing A(X and Y) or A(M)
Within the list of a key, multiple queries are possible. A query contains a compare operator, the values to be searched and a logical operator that determines how the values should be handled. E.g:
{"operator":"equal","values":[123456],"logicalOperator":"or"} {"operator":"notEqual","values":["string", "otherString"],"logicalOperator":"and"} {"operator":"greaterThanEqual","values":["0830-01-01"]}
With the example above, we can textualize the outcome:
?search={"valueTypeID":[{"operator":"equal","values":[123456],"logicalOperator":"or"}], "typeName":[{"operator":"notEqual","values":["Chain", "Burial object"],"logicalOperator":"and"}]}&search={"valueTypeName":[{"operator":"equal","values":["Gold"],"logicalOperator":"or"}]}
Get entities which have the valueTypeID 123456 AND NOT the types called "Chain" AND "Burial object", OR all entities which have the valueTypeName "Gold".
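Because the search parameter carries JSON inside a URL, it is easiest to build it programmatically rather than by hand. A minimal sketch: the criteria mirror parts of the examples above, and `json.dumps` plus `urlencode` take care of quoting and percent-escaping.

```python
import json
from urllib.parse import urlencode, parse_qs

# Criteria modeled on the examples above: exclude two type names and
# require an end date on or after 0830-01-01.
search = {
    'typeName': [{'operator': 'notEqual',
                  'values': ['Chain', 'Bracelet'],
                  'logicalOperator': 'and'}],
    'endFrom': [{'operator': 'greaterThanEqual',
                 'values': ['0830-01-01']}],
}
# json.dumps produces valid JSON; urlencode percent-escapes it for the URL.
query_string = urlencode({'search': json.dumps(search)})

# Round-tripping confirms the server would receive exactly this JSON.
decoded = json.loads(parse_qs(query_string)['search'][0])
print(decoded == search)  # True
```

The resulting `query_string` can be appended to any endpoint URL after a `?`.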
<=, !=, <, <=, >, >=, LIKE, IN, AND, OR, AND NOT, OR NOT>
The filter parameter is used to specify which entries should be returned.
?filter=<XXX>
Please note that the filter values translate directly into SQL. For example:
?filter=and|name|like|Ach&filter=or|id|gt|5432
AND e.name LIKE %%Ach%% OR e.id > 5432
?filter=or|id|gt|150&filter=anot|id|ne|200 ?filter=and|name|like|Ach
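The pipe-separated filter values above map onto SQL fragments. The sketch below illustrates that translation; the operator and logic keywords ('like', 'gt', 'ne', 'anot', ...) are assumptions inferred from the examples, not the full API vocabulary.

```python
# Hypothetical translation of a 'logic|column|operator|value' filter
# triple into a SQL fragment, following the examples above.
OPERATORS = {'eq': '=', 'ne': '!=', 'lt': '<', 'le': '<=',
             'gt': '>', 'ge': '>=', 'like': 'LIKE'}
LOGIC = {'and': 'AND', 'or': 'OR', 'anot': 'AND NOT', 'onot': 'OR NOT'}

def filter_to_sql(value: str) -> str:
    """Turn one filter value into an (illustrative) SQL fragment."""
    logic, column, op, operand = value.split('|')
    if op == 'like':
        operand = f'%{operand}%'  # substring match, as in the example
    return f"{LOGIC[logic]} e.{column} {OPERATORS[op]} '{operand}'"

print(filter_to_sql('and|name|like|Ach'))  # AND e.name LIKE '%Ach%'
print(filter_to_sql('or|id|gt|5432'))      # OR e.id > '5432'
```

A real implementation would of course pass the operand as a bound query parameter rather than interpolating it, to avoid SQL injection.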
first=<id> OR last=<id> OR page=<int>
The page parameter will take any number as page number and provides the entities of this page.
The first parameter takes IDs and will show every entity after and including the named ID.
The last parameter takes IDs and will show every entity after the named ID.
?page=<int>
?first=<id>
?last=<id>
Page, first and last will only take numbers. First and last have to be valid IDs. The table will be sorted AND filtered before the pagination takes place.
?page=4 ?last=220 ?first=219
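The first/last semantics described above can be illustrated on a local, already sorted list of entity IDs instead of a live API call; the function names are hypothetical.

```python
# last=<id> returns every entity strictly after the named ID;
# first=<id> includes the named ID itself.
def page_after(ids: list[int], last: int, limit: int) -> list[int]:
    """Return up to `limit` IDs that come strictly after `last`."""
    start = ids.index(last) + 1
    return ids[start:start + limit]

def page_from(ids: list[int], first: int, limit: int) -> list[int]:
    """Return up to `limit` IDs starting at and including `first`."""
    start = ids.index(first)
    return ids[start:start + limit]

ids = [100, 219, 220, 305, 410]
print(page_after(ids, 219, 2))  # [220, 305]
print(page_from(ids, 219, 2))   # [219, 220]
```

This matches the ?first=219 / ?last=220 examples above: first includes the given ID, last does not.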
<'when', 'types', 'relations', 'names', 'links', 'geometry', 'depictions', 'none'>
The show parameter takes the key values of a JSON. If no value is given, every key will be filled. If a value is given, only the keys which are submitted will be shown. If the parameter contains none, no additional keys/values will be shown.
?show=<'when', 'types', 'relations', 'names', 'links', 'geometry', 'depictions', 'none'>
For each value, a new parameter has to be set. Values are matched against a list of keywords, so invalid input is ignored.
?show=when ?show=types ?show=types&show=when ?show=none
lp, geojson, pretty-xml, n3, turtle, nt, xml
With the format parameter, the output format of an entity representation can be selected. lp stands for Linked Places format, which is the default. For information on other formats, please refer to API Output Formats.
?format=<lp, geojson, pretty-xml, n3, turtle, nt, xml>
Only the last format parameter counts as valid input. This parameter is not case-sensitive.
?format=lp ?format=geojson ?format=n3
<int>
The whole search query is filtered by this type ID. Multiple type_id parameters are valid and are combined with a logical OR.
?type_id=<id>
type_id only takes a valid type ID.
?type_id=<int>
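Encoding multiple type_id parameters, which the API combines with a logical OR, can be sketched as follows (the two IDs are arbitrary examples):

```python
from urllib.parse import urlencode

# Two arbitrary example type IDs; repeated type_id parameters
# are combined by the API with a logical OR.
query = urlencode({'type_id': [1234, 5678]}, doseq=True)
# → 'type_id=1234&type_id=5678'
```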
<>
Returns a JSON with the total count of the included entities.
?count
Only count triggers the function; any value assigned to it makes no difference.
?count
<>
Will trigger the download of the result of the request path.
?download
Only download triggers the function; anything assigned to it is discarded.
?download
<'en', 'de'>
Selects the language in which content is displayed.
?lang
The default value is None, which means the default language of the OpenAtlas instance is used.
?lang ?lang=en ?lang=DE
gisAll, gisPointAll, gisPointSupers, gisPointSubs, gisPointSibling, gisLineAll, gisPolygonAll
Filters which geometric entities are retrieved through /geometric_entities. Multiple geometry parameters are valid.
?geometry
The default value is gisAll. Be aware, this parameter is case-sensitive!
?geometry=gisPointSupers ?geometry=gisPolygonAll
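Multiple geometry parameters for /geometric_entities can be encoded the same way as other repeated parameters. A sketch; the values must keep their exact casing:

```python
from urllib.parse import urlencode

# Values are case-sensitive; repeating the parameter combines filters.
query = urlencode(
    {'geometry': ['gisPointSupers', 'gisPolygonAll']}, doseq=True)
# → 'geometry=gisPointSupers&geometry=gisPolygonAll'
```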
2024-05-08 Developer meeting
no-date OpenAtlas-Discovery API Meeting 2024-04
2024-04-10 Approaching Byzantium
2024-03-27 Welterbe Salzkammergut
2024-03-21 Developer meeting
2024-03-20 MAMEMS
2024-03-13 Approaching Byzantium
2024-02-21 Frontend API meeting
2024-02-02 MAMEMS/Approaching Byzantium
2024-01-31 Developer meeting
2024-01-31 FemCare Vienna
2024-01-23 Approaching Byzantium
2023-12-06 Approaching Byzantium
2023-11-23 Developer meeting
2023-11-21 Approaching Byzantium
2023-11-10 MAMEMS/Approaching Byzantium
2023-10-11 MAMEMS data entry meeting
2023-09-13 Developer meeting
2023-07-01 FemCare Vienna
2023-05-17 Developer meeting
2023-05-04 MAMEMS/Approaching Byzantium
2023-04-04 Developer meeting
2023-03-22 OpenAtlas website workshop
2023-02-14 Approaching Byzantium
2023-02-01 Developer meeting
2023-01-26 MAMEMS/Approaching Byzantium
2022-12-19 Approaching Byzantium
2022-12-06 Developer meeting
2022-11-18 Approaching Byzantium
2022-10-24 Shahi
2022-10-19 OpenAtlas and ARCHE for INDIGO
2022-10-19 OpenAtlas Discovery
2022-10-05 Developer meeting
2022-09-29 Approaching Byzantium
2022-06-29 MAMEMS
2022-06-28 Developer meeting
2022-06-13 Approaching Byzantium
2022-06-08 MAMEMS
2022-05-30 CONNEC
2022-05-04 Developer meeting
2022-04-22 INDIGO/Vocabs
2022-04-19 CONNEC
2022-04-05 INDIGO/Vocabs
2022-04-01 Shahi
2022-03-31 INDIGO/Vocabs
2022-03-04 UI map
2022-03-01 INDIGO
2022-02-14 Developer meeting
2022-02-02 CONNEC
2022-01-25 HIPHILS
2022-01-20 UI map
2022-01-17 Developer meeting
2022-01-13 CONNEC
2022-01-13 ARCHE
2021-12-21 MAMEMS/Approaching Byzantium
2021-12-09 CONNEC
2021-12-08 MAMEMS/Approaching Byzantium
2021-12-07 INDIGO
2021-11-03 Developer meeting
2021-10-15 CONNEC
2021-10-07 Approaching Byzantium
2021-10-05 INDIGO
2021-09-28 Shahi
2021-07-27 Frontend/API
2021-07-08 Kubernetes
2021-06-14 Developer meeting
2021-05-10 Developer meeting
2021-03-19 Hackathon Retreat
2021-02-17 Frontend/API workshop
2021-01-28 Frontend/API
2020-09-28 Strategic meeting
2020-09-08 API meets Frontend
2020-06-17 Hackathon
2020-06-15 API
2020-05-19 Frontend
2020-05-06 Developer meeting
2020-03-03 Frontend
2019-11-27 Developer meeting
2019-10-28 Developer meeting
2019-09-19 Developer meeting
Here is a list of ideas for project topics (#2170), feel free to add/adapt
Either get it from PyCharm directly or use these instructions for Debian.
Under menu item File > Settings when using PyCharm IDE
These are some personal preferences of Alex which don't have any effect on submitted code.
In order to record which part of the document contains the respective reference, a delimiter, i.e. a certain value that determines the position within the reference, is stored along with the link between the entities. This can be the page number of a book, a chapter, a figure number etc.
E.g.: E31 Document "Book of Kells" - P67 "refers to" - E21 Person "Saint Mary" at delimiter: "Folio 7v"
The link between an entity and the Authority Document would be stored in the model.link table in the following way
This combination of E32 and delimiter could furthermore be resolved as E31 "Document", as it is a unique reference documenting the entity while the E32 alone is the container for all possible references from this authority document.
E21 Person "Terry Pratchett" - is referred to by (P67) - E31 document (https://www.wikidata.org/wiki/Q46248) - P71 is listed in - E32 Authority Document (https://www.wikidata.org/wiki/)
Various entities can be connected to files. This is mapped as E31 (Document = file) refers to (P67) -> E1. In our case files can refer to any of the "top level" entities.
Files can (but need not) be images. Files are stored with a certain system type (i.e. file). If the file is an image, it is most probably a depiction of the entity.
A file can also have a further reference, e.g. the source where the file comes from. This is for example the bibliographical reference to the publication from which a file (e.g. a scanned image) was extracted. In this case there is a link from a document E31 with a type "Bibliography" (or a subtype), via P67, to another document E31 with the system type "file". Here the file is not the depiction of the reference; rather, the reference is the origin of the file. This is mostly needed to document the copyright or rights holder, or the source of the file.
pybabel extract -F openatlas/translations/babel.cfg -k lazy_gettext -o openatlas/translations/messages.pot .
pybabel update -i openatlas/translations/messages.pot -d openatlas/translations
sphinx-build ./sphinx/source/ openatlas/static/manual
pg_dump openatlas_demo > /var/lib/postgresql/openatlas_demo.sql
pg_dump openatlas_demo_dev > /var/lib/postgresql/openatlas_demo_dev.sql
git pull origin main
python3 install/upgrade/database_upgrade.py
Settlement: permanent Settlement awan (Stadt, Dorf, Siedlungsplatz) selište – селиште, селище (Plansiedlung) selište – селиште, селище (Wüstung) rural Settlement: Dorf gewł, geoł (Dorf) gewłak´ałak´ (befestigtes Dorf/Siedlung), selo – село (Dorf) villa, ad XY (Dorf, Siedlung) villula, locus, (kleines Dorf) castrum (befestigtes Dorf) Weiler zaselĭkĭ – заселькь (Weiler) Gehöft agarak (Gehöft, Landgut) curtis (Gehöft, Landgut) gawit´ (Hof), dastakert (Landgut, Besitztum) manus/a/um, colonius/a (Abhängige Hofstelle) Casa, Domus, Aedificium (Haus als Teil eines Besitzes) urban settlement gradŭ – градъ (Stadt, Oberstadt) mayrak´ałak´ (Mutterstadt, Metropolis) befestigte Stadt: k´ałak´ (befestige Stadt) urbs, oppidum, civitas (Stadt) temporary Settlement: Refugium comitatus (mobiler Hof des Königs) Feldlager banak (Feldlager) Saisonale Sieldung katunŭ – катоунъ (temporäre nomadische Siedlung) Alm Military Facility: amroc´ (Festung, Befestigung), berd (Festung, Burg), mijnaberd (Zitadelle, Akropolis) gradŭ – градъ (Festung, Burg) castrum (Burg, mehrfache Bedeutung d. Wortes) Clusura, Claustra, Clusa (Talsperre) , Ritual Site Church: ekełeci (Kirche), martyrium, vkayanoc´ (Kirche eines Apostels o. 
wichtigen Märtyrers) crĭkva – црьква (Kirche) crŭkŭvište – цръкъвиште (verfallene Kirche) basilica (tolle Kirche) ecclesia (Kirche) parva ecclesia (kleine Kirche) Monastery: vank´ (Kloster), mainjin (Eremitage, Gemeinschaft von Eremiten, „Laura“), kusastan (Frauenkloster) manastyrĭ – манастъірь (Kloster) metohŭ – метохъ; metohija – метохиꙗ (Metochion) Monasterium (Kloster) monasterium puellarum (Frauenkloster) cella, cellula (kleines Kloster) See of Bishopric: kat´ołikē (Kathedrale, Hauptkirche) Mosque Temple: atrušan (Feuertempel, Zoroastrismus), bagin (heidnischer Altar oder Schrein), mehean (Mithrastempel, heidnischer Schrein) Synagogue Burial Site Cemetery Inhumation Cemetery Biritual Cemetery Cremation Cemetery Churchyard Infrastructure Traffic Watercrossing Ford Ferry Bridge Landing Places Way Main Way Sideway Pass Stationen Abgabestation (Miete für Tretboot/Maut/Zoll...) Raststation Hospitium Economic Site Agricultural Site Acker Weide Sommerweide Winterweide Gärten Weingarten Forst Fischrevier Jagdrevier partēz, draχt (Garten, Obstgarten), mazri, p´ayt (Wald, Holz), ang, art (Feld) Meadows: carak (Weide) Vineyard: gini, aygestan, aygi (Wein, Weingarten) planina – планина (Sommerweide) zimovište – зимовиште (Winterweide) paša, pašište – паша, пашиште (Weide, Weideland) zabělĭ – забѣль (Weide, Hain) niva – нива (Acker) kupljenica – коуплѥница (gekaufter Acker) polje – полѥ (Feld) vinogradŭ – виноградъ (Weinberg, Weingarten) alpis, alpestris (Alm) agrum, XX hobas de terra arabili (Acker) pratum, pascua, campus (Wiese, Weideland) silva (Wald/weide) vineum (Weingarten => Symbol of Power) piscatio (Fischereirecht) venatio (Jagdrecht => Symbol of Power) Industrial Site Rohstoffgewinnung Salz Salz: ałtk (Salzlager), ad sal coquendum, patella, glago (Salzkochstelle) Metall Eisengewinnung Erz: erkat´ (Eisen); aruzi (Erz) Kupfergewinnung Weißmetallgewinnung Edelmetall arčat (Silber, Gold), Verarbeitung/Produktion Mühle voděnica – водѣница (Wassermühle) Schmiede 
Trading Sites Markt Market: vačar (Markt); trŭgŭ – тръгъ (Marktplatz, Markt) panagjurŭ – панагюръ (Jahrmarkt) forum, emporium (Markt) Boundary Mark Grenzpunkt Grenzlinie Grenzstreifen Grenzraum (genau bezeichnete Grenzpunkte wie zB Bäume, Flusseinmündungen etc.) termina, fines (Grenzraum) comitatus, marca (Grenzherrschaft) kraište – краиште (Grenzregion, Militärgrenze, Grenze) Topographical Entities Waterbody Lake River Sea Swamp Mountain gora – гора; brŭdo – бръдо (Berg) mons, montes (Berg, Gebirge) collis (Hügel) Valley Forest Urwald?
Below are some database related code snippets used in development.
It is possible (for users with the role admin or manager) to make SQL dumps via the backend anytime: admin -> data -> SQL export. Also, a backup is made every time the database_upgrade.py script, which is used when upgrading to a new OpenAtlas version, is called, before any database changes are done. These backups are located at files/export and, because the date of creation is included in the filename, it is easy to see when they were made. Of course it is good practice to also store these SQL backups on another machine, e.g. in case of a hard disk failure, but this is outside the scope of OpenAtlas.
In case these backups are needed, let's say because an import went wrong, the workflow on a Debian machine would be as follows. The commands are executed as the postgres user and it is assumed the database and user are both called openatlas.
dropdb openatlas
createdb openatlas -O openatlas
psql openatlas < path/to/the/unpacked/backup.sql
Export the database structure from the test database (used to avoid specialties in production databases) into install/1_structure.sql
pg_dump -sc --if-exists -n model -n gis -n log -n web -n import openatlas_test > install/1_structure.sql
pg_dump openatlas --rows-per-insert 10000 -a -t model.cidoc_class -t model.cidoc_class_i18n -t model.cidoc_class_inheritance -t model.property -t model.property_i18n -t model.property_inheritance > install/2_data_model.sql
pg_dump -n web openatlas > /tmp/openatlas_web.sql
pg_dump -n model openatlas > /tmp/openatlas_model.sql
There are mainly two ways of restoring a PostgreSQL database on Windows. For clarification of the commands and options, please read the pg_dump and pg_restore documentation.
Export on the Debian server as the postgres user with the following command:
pg_dump -Fc openatlas > openatlas.dump
This dump file can be used with pgAdmin 4 to restore the database, or with the following command in PowerShell (Git Bash doesn't work). Please adjust the variables to your needs:
pg_restore.exe --host "localhost" --port "5432" --username "openatlas" --dbname "openatlas" --clean --verbose "path to dump file"
Export on the Debian server as the postgres user with the following command:
pg_dump --attribute-inserts openatlas > openatlas.sql
/var/lib/postgresql/reset_demo.sh
Replace schema.table below accordingly and execute:
CREATE TRIGGER update_modified BEFORE UPDATE ON schema.table FOR EACH ROW EXECUTE PROCEDURE model.update_modified();
e.g. after a case study separation (not sure if this statement is now showing really missing or just needed locations)
SELECT e.id, e.name, e.system_type FROM model.entity e
WHERE e.class_code = 'E53' AND e.system_type = 'place location' AND e.id IN (
SELECT r.id FROM model.link lr JOIN model.entity r ON lr.range_id = r.id AND lr.property_code IN ('P74', 'OA8', 'OA9', 'P7'))
AND e.id NOT IN (SELECT range_id FROM model.link WHERE property_code = 'P53');
DELETE FROM model.entity WHERE id IN (
SELECT id FROM model.entity WHERE system_type = 'place location' AND id NOT IN (
SELECT e.id FROM model.entity e JOIN model.link l ON e.id = l.range_id AND l.property_code = 'P53'
WHERE e.class_code = 'E53' AND e.system_type = 'place location'));
To get all child events of a given event, use the SQL below (replace ROOT_EVENT_ID at the bottom). It works but is slow and could be improved.
WITH RECURSIVE tree AS (
SELECT e.id, ARRAY[]::INTEGER[] AS ancestors
FROM model.entity e
WHERE (SELECT s.id FROM model.entity s JOIN model.link l ON s.id = l.range_id AND l.domain_id = e.id AND l.property_id = (SELECT id FROM model.property WHERE code = 'P117')) IS NULL
UNION ALL
SELECT e.id, tree.ancestors ||
(SELECT s.id FROM model.entity s JOIN model.link l ON s.id = l.range_id AND l.domain_id = e.id AND l.property_id = (SELECT id FROM model.property WHERE code = 'P117'))
FROM model.entity e, tree
WHERE (SELECT s.id FROM model.entity s JOIN model.link l ON s.id = l.range_id AND l.domain_id = e.id AND l.property_id = (SELECT id FROM model.property WHERE code = 'P117')) = tree.id
)
SELECT * FROM tree WHERE ROOT_EVENT_ID = ANY(tree.ancestors);
#1064 In conception phase, draft by Stefan:
Due to the increasing complexity and additional features regarding written sources and their interconnections we need to reorganise the respective UI.
It should be easy to distinguish between:
Core
Entities referred to by the content
Entities involved in creation of the source
Linking between event entities from the content
Further links
One thing more
Whitepaper Geometries
Demo: http://homepage.univie.ac.at/stefan.eichert/shapedraw/
I. Physical things like buildings, settlements, regions, areas etc. that have or originally had a position in space and a certain extent
II. Roads/routes/rivers that have or originally had a position in space and a certain extent
III. (Find-)Spots with no spatial extent that only have point coordinates.
case 1: The extent is known and can be drawn as a polygon that represents the extent (= shape) of the physical thing
E.g. the shape of a building or the area of excavation or the area of a settlement that can be drawn for example from an aerial photograph or a map.
case 2: The extent is not known, but the thing is known to be within a larger area with known extent that can be drawn as a polygon.
E.g a no longer existing settlement that is known to have been situated within a known area for example in a valley between two other known settlements.
case 3: The extent is not known, but the thing is known to be within a larger predefined area with known extent that is already in the database.
E.g. an archaeological findspot of unknown position that is known to have been situated inside the boundaries of a certain administrative unit.
case 4: The extent is not known, but the thing is known to be within a larger area with unknown extent that cannot be drawn as a polygon
E.g. a no longer existing settlement that is known to have been situated within the historical boundaries of a no longer existing county.
case 5: There is no extent but only a known centerpoint
E.g. the coordinates where one find has been found.
case 6: Neither the extent nor a vague position within a reasonable larger area is known.
In many cases the exact identification of physical things/places mentioned in sources (e.g. in charters or in archaeological publications) with one certain and still existing physical entity and its extent is not possible. Therefore it is necessary to allow multiple possibilities to record possible locations of physical things:
A charter for example may mention one church, and today two still existing churches might be identified with the one mentioned. In this case the church entity from the source should be linked to two possible spatial objects; here these would be two polygons, each representing the extent of the respective church. However, any combination of the above mentioned cases must be technically and conceptually possible in any number. In theory it must be possible to link e.g. a castle known from a charter to the very extent of a still existing castle and at the same time to a vanished castle that is known to have been located within a certain area, and also to another possible location within a certain administrative unit etc.
Each entity with spatial position will be represented on the map at least with a marker as point. If polygon data is available these polygons will be shown too.
We want to offer the possibility to define the spatial position in any combination of the aforementioned categories. Therefore the location of the physical entity is connected 1-n to one or more entries in the GIS tables.
In the map interface:
1.)
The user should be able to draw polygons or points to define either the extent of the physical entity or an area in which the physical thing is situated.
Also, predefined categories should be selectable to define whether it is an extent or an area.
These polygons should be editable and deletable.
Methods: leaflet draw polygon and postgis
2.)
The user should be able to define an already existing area in which the site is located
2a) Administrative Unit: with known extend
2b) Historical Region: with unknown extend
3.)
The user should be able to set markers to define a point of location. In this case a point is drawn and saved to gis_centerpoints. No polygon is drawn.
Postgis to/from Leaflet: see Stefan's drawshapes.js
1. Get GeoJSON from existing polygons and show them on the map.
2. Show centerpoints on the map.
3. Make polygons from vertices drawn in Leaflet and save new and edited ones to the DB.
4. Delete polygons from Leaflet and the DB.
Triggers/Warnings:
5. Create/update centerpoint data automatically after polygon is drawn or edited using postgis ST_PointOnSurface
SELECT ST_AsText(ST_PointOnSurface((SELECT geom FROM openatlas.polygon WHERE id = <value of id>)));
Insert or replace the result into gis_centerpoints.
6. Warnings if Point data is updated and moved outside of polygon using postgis ST_Intersects
SELECT ST_Intersects(ST_GeomFromText('POINT(<new X> <new Y>)', 4326), (SELECT geom FROM openatlas.polygon WHERE id = <value of id>));
If true: all is good. If false: alert and do not allow drawing a new centerpoint.
to be extended... on the fly
Model: https://mapfig.org/map/
Datainput:
offer workflow
After new Site form is filled: offer button for localisation (next to save).
If pushed: save site and open Map-Tab
1.
select type of localisation (see 2.)
2.
if 1 - start polygon draw-tool in map and open form (type is automatically "shape")
if 2 - start polygon draw-tool in map and open form (type is automatically "area")
if 3 - offer selection tool for administrative units
if 4 - offer selection tool for historical regions
https://www.jstree.com/ for 3 and 4
if 5 - start markerdraw (as it is by now)
3.
after 1-3: save to db and trigger the creation of a centerpoint
after 4: if historical region has a geometry trigger the creation of a centerpoint else don't
after 5: nothing necessary
Map: use existing map Interface but remove buttons.
after one localisation is done: offer the possibility to add one more (1-n)
Sphinx is used to generate the in-application manual. The content of the manual can be changed in the .rst (reStructuredText) files at sphinx/source.
It is important that Sphinx version 5.3.0 is used, to prevent overwriting each other's changes and running into formatting problems. E.g. if installing with pip:
pip3 install sphinx==5.3.0 sphinx-rtd-theme
After finishing changes, it is a good practice to delete the openatlas/static/manual directory and generate the whole manual again. Otherwise, not all changes (e.g. menu structure) may be propagated correctly.
To avoid issues with different versions and operating systems, the following workflow is advised:
git checkout -b feature_manual_new_topic
sphinx-build ./sphinx/source/ openatlas/static/manual
rm -R openatlas/static/manual/
git checkout develop openatlas/static/manual/
git status
git add .
git commit -m "Nice commit message"
git checkout develop
git merge feature_manual_new_topic
git branch -d feature_manual_new_topic
apt install python3-sphinx python3-sphinx-rtd-theme
OpenAtlas is developed with some standards in mind.
Of course we do our best to avoid bugs in the first place, but if bugs become apparent they always have the highest priority.
Name | Version | Setup | Configuration | Description
---|---|---|---|---
Eslint | 8.35.0 | Through npm in relevant projects | .eslintrc | Checks for errors, coding standards and code smells in js/ts code
Name | Version | PyCharm | Configuration | Description
---|---|---|---|---
PEP 8 | | integrated | | Style guide for Python code
nose tests | 1.3.7 | integrated | tests/.noserc tests/.coveragerc | Cover Python code using assertions, some parts (e.g. mail) are excluded via # pragma: no cover. Can be run manually with: nosetests3 -c tests/.noserc
Mypy | 1.01 | plugin Mypy | mypy.ini | Optional static type checker for Python
Pylint | 2.16.2 | plugin Pylint | .pylintrc | Checks for errors, coding standards and code smells
SonarQube | 5 | | sonar-project.properties | Cumbersome to install (so not mandatory for developers) but interesting code hints. PyCharm plugin exists but doesn't seem to work for Python
Radon can be used to compute various code metrics. Installation: apt install radon. To show e.g. metrics about cyclomatic complexity: radon cc openatlas
Although pip offers some advantages (e.g. more current packages, and it is usable for different installation environments), we prefer packages from the Debian repository for the OpenAtlas backend.
Main reasons concern security, reliability and stability.
Here we collect topics to explore when starting to develop a presentation site for a specific project.
Issue #2079
Text annotation has been on our radar for some time now, so I created an issue to discuss how we can proceed.
Basically it's about linking entities (actors, places, ...) to specific parts of a text, instead of just linking them to the whole text as is currently possible.
We need a tool for users to annotate. Although difficult, text changes and annotations should be doable in one form element.
Advantages
We will save the information in an extra database table. Draft for fields:
In case we implement it in the course of a cooperation with ENCHANT, the time frame for 2024 would be: a working basic implementation in summer and a more complete version at the end of the year.
Format draft GeoJSON
{ "158275":[ { "id":158275, "parentId":0, "rootId":158275, "openatlasClassName":"place", "crmClass":"E18", "created":"2021-03-17 14:28:58.370691", "modified":"2021-07-21 07:40:08.918682", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"Point", "coordinates":[ 15.61921, 48.30621 ], "title":"Absdorf", "description":"Absdorf (2783054), imported from GeoNames" }, "children":[ 158277, 158281, 158295 ], "properties":{ "name":"Absdorf Statzendorf", "aliases":null, "description":"In 1933, three inhumation burials were found on plot 31in Absdorf (Statzendorf). They were retrieved for the Niederösterreichisches Landesmuseum.", "standardType":{ "name":"Inhumation Cemetery", "id":22378, "rootId":73, "path":"Cemetery > Burial Site > Place" }, "timespan":{ "earliestBegin":"0700-01-01", "latestBegin":null, "earliestEnd":"0950-01-01", "latestEnd":null }, "externalReferences":[ { "type":"closeMatch", "identifier":"https://www.geonames.org/2783054", "referenceSystem":"GeoNames" } ], "references":[ { "id":36560, "abbreviation":"Friesinger 1971-1974", "title":"Herwig Friesinger, Studien zur Archäologie der Slawen in Niederösterreich. 
Mitteilungen der Prähistorischen Kommission 15/16 (Wien 1971-74).", "pages":"68" } ], "files":null, "types":[ { "id":5099, "rootId":5097, "name":"Excavation", "path":"Evidence > Archaeology", "value":null, "unit":null } ] } }, { "id":158277, "parentId":158275, "rootId":158275, "openatlasClassName":"feature", "crmClass":"E18", "created":"2021-03-17 14:30:16.736204", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":null, "properties":{ "name":"Grave 001", "aliases":null, "description":"", "standardType":{ "name":"Single Grave", "id":26205, "rootId":13362, "path":"Grave > Feature" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":[ { "id":161492, "name":"Absdorf_001", "fileName":null, "license":"Bildzitat", "source":null } ], "types":[ { "id":15679, "rootId":15678, "name":"Height", "path":"Dimensions", "value":40, "unit":"cm" }, { "id":22300, "rootId":5118, "name":"Earth pit", "path":"Grave Construction", "value":null, "unit":null }, { "id":22309, "rootId":22308, "name":"Flat grave", "path":"Grave Shape", "value":null, "unit":null } ] } }, { "id":158281, "parentId":158275, "rootId":158275, "openatlasClassName":"feature", "crmClass":"E18", "created":"2021-03-17 14:40:17.421455", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":null, "properties":{ "name":"Grave 002", "aliases":null, "description":"", "standardType":{ "name":"Single Grave", "id":26205, "rootId":13362, "path":"Grave > Feature" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":[ { "id":161493, "name":"Absdorf_002", "fileName":null, "license":"Bildzitat", "source":null } ], "types":[ { 
"id":15679, "rootId":15678, "name":"Height", "path":"Dimensions", "value":205, "unit":"cm" }, { "id":22300, "rootId":5118, "name":"Earth pit", "path":"Grave Construction", "value":null, "unit":null }, { "id":22309, "rootId":22308, "name":"Flat grave", "path":"Grave Shape", "value":null, "unit":null } ] } }, { "id":158277, "parentId":158275, "rootId":158275, "openatlasClassName":"feature", "crmClass":"E18", "created":"2021-03-17 14:30:16.736204", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":[ 158279 ], "properties":{ "name":"Grave 001", "aliases":null, "description":"", "standardType":{ "name":"Single Grave", "id":26205, "rootId":13362, "path":"Grave > Feature" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":[ { "id":161492, "name":"Absdorf_001", "fileName":null, "license":"Bildzitat", "source":null } ], "types":[ { "id":15679, "rootId":15678, "name":"Height", "path":"Dimensions", "value":40, "unit":"cm" }, { "id":22300, "rootId":5118, "name":"Earth pit", "path":"Grave Construction", "value":null, "unit":null }, { "id":22309, "rootId":22308, "name":"Flat grave", "path":"Grave Shape", "value":null, "unit":null } ] } }, { "id":158281, "parentId":158275, "rootId":158275, "openatlasClassName":"feature", "crmClass":"E18", "created":"2021-03-17 14:40:17.421455", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":[ 158283 ], "properties":{ "name":"Grave 002", "aliases":null, "description":"", "standardType":{ "name":"Single Grave", "id":26205, "rootId":13362, "path":"Grave > Feature" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":[ { "id":161493, 
"name":"Absdorf_002", "fileName":null, "license":"Bildzitat", "source":null } ], "types":[ { "id":15679, "rootId":15678, "name":"Height", "path":"Dimensions", "value":205, "unit":"cm" }, { "id":22300, "rootId":5118, "name":"Earth pit", "path":"Grave Construction", "value":null, "unit":null }, { "id":22309, "rootId":22308, "name":"Flat grave", "path":"Grave Shape", "value":null, "unit":null } ] } }, { "id":158283, "parentId":158281, "rootId":158275, "openatlasClassName":"stratigraphic_unit", "crmClass":"E18", "created":"2021-03-17 14:41:19.106407", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":[ 158285, 158287, 158289, 158291 ], "properties":{ "name":"Burial 002", "aliases":null, "description":"", "standardType":{ "name":"Skeleton", "id":26519, "rootId":13365, "path":"Burial (strat. Unit) > Stratigraphic unit" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":null, "types":[ { "id":22283, "rootId":22276, "name":"Grown up", "path":"Anthropology > Age", "value":null, "unit":null }, { "id":128052, "rootId":128046, "name":"Supine position", "path":"Body posture > General", "value":null, "unit":null }, { "id":120168, "rootId":119049, "name":"Female", "path":"Gender", "value":null, "unit":null }, { "id":158201, "rootId":158197, "name":"West-East", "path":"Orientation", "value":null, "unit":null } ] } }, { "id":158285, "parentId":158283, "rootId":158275, "openatlasClassName":"artifact", "crmClass":"E22", "created":"2021-03-17 14:48:50.584367", "modified":"2021-11-18 10:30:37.672755", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":null, "properties":{ "name":"Find 001 (NÖLM Inv. No. 
6993)", "aliases":null, "description":"Pressed, bulbous pot, heavily marbled with limestones and graphite chunks. Level base. Slightly curved, conical wall part. Spherical shoulder. Mouth rim tapering diagonally to the top and cut off with a sharp rim diagonally to the bottom. Traces of molded wood on the inside of the mouth seam. The shoulder is decorated with two rows of irregular, obliquely positioned, comb-shaped inlays. At the shoulder border and just below it, two rows of four-lined, closely set, circumferential wavy bands.\r\nNÖLM, Inv. No. 6993", "standardType":{ "name":"Pot", "id":26301, "rootId":157754, "path":"Pottery > Artifact" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":[ { "id":161495, "name":"Absdorf_002_01", "fileName":null, "license":"Bildzitat", "source":null } ], "types":[ { "id":156461, "rootId":141697, "name":"Brown", "path":"Color", "value":null, "unit":null }, { "id":141703, "rootId":141697, "name":"Black", "path":"Color", "value":null, "unit":null }, { "id":150412, "rootId":15678, "name":"Bottom diameter", "path":"Dimensions > Diameter", "value":8.6, "unit":"cm" }, { "id":150696, "rootId":15678, "name":"Max Diameter", "path":"Dimensions > Diameter", "value":16.2, "unit":"cm" }, { "id":150413, "rootId":15678, "name":"Top Diameter", "path":"Dimensions > Diameter", "value":13.2, "unit":"cm" }, { "id":15679, "rootId":15678, "name":"Height", "path":"Dimensions", "value":15.7, "unit":"cm" }, { "id":26561, "rootId":21160, "name":"Ceramic", "path":"Material > Geological > Clay", "value":0, "unit":"weight percentage (0 = unknown)" }, { "id":23448, "rootId":23440, "name":"Foot left", "path":"Position of Find in Grave > Lower Body > Leg Left", "value":null, "unit":null } ] } }, { "id":158287, "parentId":158283, "rootId":158275, "openatlasClassName":"artifact", "crmClass":"E22", "created":"2021-03-17 14:50:10.530195", "modified":"2021-11-18 
10:30:37.672755", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":null, "properties":{ "name":"Find 002 (NÖLM Inv. No. 6994)", "aliases":null, "description":"Shells of three eggs were found next to the skull.\r\nNÖLM Inv. No. 6994", "standardType":{ "name":"Eggshells", "id":26501, "rootId":157754, "path":"Food Offering > Cult Object > Artifact" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":null, "types":[ { "id":26536, "rootId":21160, "name":"Egg Shell", "path":"Material > Organic > Animal", "value":0, "unit":"weight percentage (0 = unknown)" }, { "id":23441, "rootId":23440, "name":"Head", "path":"Position of Find in Grave", "value":null, "unit":null }, { "id":128787, "rootId":128783, "name":"Number", "path":"Count", "value":3, "unit":"pcs." } ] } }, { "id":158289, "parentId":158283, "rootId":158275, "openatlasClassName":"artifact", "crmClass":"E22", "created":"2021-03-17 14:52:22.473759", "modified":"2021-11-18 10:30:37.672755", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":null, "properties":{ "name":"Find 003 (NÖLM Inv. No. 6994)", "aliases":null, "description":"One eggshell was found underneath the pot (see find 001) that was placed next to the left foot.\r\nNÖLM Inv. No. 
6994", "standardType":{ "name":"Eggshells", "id":26501, "rootId":157754, "path":"Food Offering > Cult Object > Artifact" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":null, "types":[ { "id":26536, "rootId":21160, "name":"Egg Shell", "path":"Material > Organic > Animal", "value":0, "unit":"weight percentage (0 = unknown)" }, { "id":23448, "rootId":23440, "name":"Foot left", "path":"Position of Find in Grave > Lower Body > Leg Left", "value":null, "unit":null } ] } }, { "id":158295, "parentId":158275, "rootId":158275, "openatlasClassName":"feature", "crmClass":"E18", "created":"2021-03-17 14:58:08.057725", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":[ 158297 ], "properties":{ "name":"Grave 003", "aliases":null, "description":"", "standardType":{ "name":"Single Grave", "id":26205, "rootId":13362, "path":"Grave > Feature" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":[ { "id":161494, "name":"Absdorf_003", "fileName":null, "license":"Bildzitat", "source":null } ], "types":[ { "id":22300, "rootId":5118, "name":"Earth pit", "path":"Grave Construction", "value":null, "unit":null }, { "id":22309, "rootId":22308, "name":"Flat grave", "path":"Grave Shape", "value":null, "unit":null } ] } }, { "id":158297, "parentId":158295, "rootId":158275, "openatlasClassName":"stratigraphic_unit", "crmClass":"E18", "created":"2021-03-17 14:59:15.987592", "modified":"2021-05-17 10:30:53.290824", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":[ 158299 ], "properties":{ "name":"Burial 001", "aliases":null, "description":"The burial is disturbed - the skull is missing and the rib cage not in situ.", 
"standardType":{ "name":"Skeleton", "id":26519, "rootId":13365, "path":"Burial (strat. Unit) > Stratigraphic unit" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":null, "types":[ { "id":22283, "rootId":22276, "name":"Grown up", "path":"Anthropology > Age", "value":null, "unit":null }, { "id":145466, "rootId":145463, "name":"Disturbed", "path":"Condition of Burial", "value":null, "unit":null }, { "id":120167, "rootId":119049, "name":"Male", "path":"Gender", "value":null, "unit":null } ] } }, { "id":158299, "parentId":158297, "rootId":158275, "openatlasClassName":"artifact", "crmClass":"E22", "created":"2021-03-17 15:00:02.579510", "modified":"2021-11-18 10:30:37.672755", "latestModRec":"2021-11-18 10:30:37.672755", "geometry":{ "type":"GeometryCollection", "geometries":[ ] }, "children":null, "properties":{ "name":"Find 001", "aliases":null, "description":"A piece is iron is mentioned in the find report but lost without a trace.", "standardType":{ "name":"Fragment", "id":123766, "rootId":157754, "path":"Varia > Artifact" }, "timespan":{ "earliestBegin":null, "latestBegin":null, "earliestEnd":null, "latestEnd":null }, "externalReferences":null, "references":null, "files":null, "types":[ { "id":26540, "rootId":21160, "name":"Iron", "path":"Material > Metal", "value":0, "unit":"weight percentage (0 = unknown)" } ] } } ] }
ID of the current entity
<id>50625</id>
ID of the absolute root (usually a place)
<rootId>50625</rootId>
ID of the direct parent; if no parent exists, the tag is empty
<parentId>50625</parentId>
<parentId/>
OpenAtlas internal name of the entity class
<openatlasClassName>place</openatlasClassName>
CIDOC CRM class
<crmClass>E18</crmClass>
Datetime of creation, of the last modification, and of the latest modification anywhere within the whole unit
<created>2015-01-07 13:27:23</created>
<modified>2021-11-18 13:31:06.467487</modified>
<latestModRec>2021-11-18T13:31:06.467487</latestModRec>
Geometry of the entity. Type refers to Point, Polygon, Linestring or GeometryCollection. <coordinates> are always lists, and <coordinate> represents a single set of longitude and latitude decimal values. <title> and <description> are text fields to specify the given place
<geometry>
<type>Point</type>
<coordinates>
<coordinate>
<longitude>16.3721557062619</longitude>
<latitude>48.5592242288297</latitude>
</coordinate>
</coordinates>
<title/>
<description/>
</geometry>
Children are the subunits "under" the entity, so place > feature > stratigraphic unit > artifact/human remains. <children> is also a list with <child> items.
<children>
<child>65519</child>
<child>65517</child>
<child>65535</child>
<child>63517</child>
<child>57199</child>
<child>62891</child>
</children>
Properties include all specific details about the entity. The following tags are included in the <properties /> tag.
<properties>
...
</properties>
Name of the entity
<name>Oberleiserberg</name>
Aliases is a list of different names for the entity
<aliases>
<alias>Oberleiserberg Cementary</alias>
</aliases>
<aliases />
Description of the entity
<description>Early medieval inhumation grave cemetery on the Oberleiserberg (mountain).
Parcel numbers: OBERLEIS 22; 24/1; 25/3
##German
Frühmittelalterliches Körpergräberfeld auf dem Oberleiserberg.
Parzellennummern: OBERLEIS 22; 24/1; 25/3</description>
If the entity has a standard type, it will be displayed here (currently not working)
<standardType/>
Timespan of the entity. We have four dates for timespans.
<timespan>
<earliestBegin>0900-01-01</earliestBegin>
<latestBegin>0900-12-31</latestBegin>
<earliestEnd>1100-01-01</earliestEnd>
<latestEnd>1100-12-31</latestEnd>
</timespan>
External references are links to reference systems like Wikidata and GeoNames. This is a list.
<externalReferences>
<externalReference>
<identifier>https://www.geonames.org/2769960</identifier>
<type>closeMatch</type>
<referenceSystem>GeoNames</referenceSystem>
</externalReference>
</externalReferences>
References are bibliographic entries for the entity. This is a list.
<references>
<reference>
<abbreviation>Brundke [in preparation]</abbreviation>
<id>126168</id>
<title>Nina Brundke, Das arpadenzeitliche Gräberfeld auf dem Oberleiserberg - Interdisziplinäre Studien zur Lebenssituation einer Population am Ende des Frühmittelalters, unveröffentlichte Doktorarbeit (in Vorbereitung), Universität Wien.</title>
<pages/>
</reference>
</references>
Files are mostly images connected to the entity. This is a list. Idea: provide an external file link?
<files>
<file>
<id>117992</id>
<name>oberleiserberg_map</name>
<fileName>117992.png</fileName>
<license>CC BY-SA 4.0</license>
<source/>
</file>
</files>
Types describe the entity in a standardized way. This is a list. Path is a string path to contextualize the type. rootId is the ID of the absolute root of the type.
<types>
<type>
<id>25176</id>
<name>Medieval</name>
<path>Stylistic Classification > History</path>
<rootId>161607</rootId>
</type>
<type>
<id>22378</id>
<name>Inhumation Cemetery</name>
<path>Place > Burial Site > Cemetery</path>
<rootId>5083</rootId>
</type>
<type>
<id>161890</id>
<name>Erstbeleg</name>
<value>1999.0</value>
<unit>Year</unit>
<path>JenaTübingen Value Types</path>
<rootId>161889</rootId>
</type>
</types>
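The tags documented above can be read with standard tools; a minimal sketch using Python's built-in ElementTree on a trimmed-down fragment of the structure (the fragment is shortened for illustration, a real response contains all tags listed above):

```python
import xml.etree.ElementTree as ET

# Shortened fragment following the documented XML structure
xml_data = """<collection>
<place>
<id>158275</id>
<parentId/>
<openatlasClassName>place</openatlasClassName>
<crmClass>E18</crmClass>
<children><child>158277</child></children>
<properties><name>Absdorf Statzendorf</name></properties>
</place>
</collection>"""

root = ET.fromstring(xml_data)
for entity in root:  # <place>, <feature>, <stratigraphic_unit>, <artifact>, ...
    entity_id = entity.findtext('id')
    name = entity.find('properties').findtext('name')
    children = [child.text for child in entity.findall('children/child')]
    print(entity_id, name, children)  # prints: 158275 Absdorf Statzendorf ['158277']
```

The same approach works for the nested <types>, <files> and <references> lists, since each list tag simply contains one child tag per item.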
<?xml version="1.0"?>
<collection>
<place>
<id>158275</id>
<rootId>158275</rootId>
<parentId/>
<openatlasClassName>place</openatlasClassName>
<crmClass>E18</crmClass>
<created>2021-03-17 14:28:58.370691</created>
<modified>2021-07-21 07:40:08.918682</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>Point</type>
<coordinates>
<coordinate>
<longitude>15.61921</longitude>
<latitude>48.30621</latitude>
</coordinate>
</coordinates>
<title>Absdorf</title>
<description>Absdorf (2783054), imported from GeoNames</description>
</geometry>
<children>
<child>158277</child>
<child>158281</child>
<child>158295</child>
</children>
<properties>
<name>Absdorf Statzendorf</name>
<aliases/>
<description>In 1933, three inhumation burials were found on plot 31 in Absdorf (Statzendorf). They were retrieved for the Niederösterreichisches Landesmuseum.</description>
<standardType>
<id>22378</id>
<name>Inhumation Cemetery</name>
<path>Cemetery > Burial Site > Place</path>
<rootId>73</rootId>
</standardType>
<timespan>
<earliestBegin>0700-01-01</earliestBegin>
<latestBegin/>
<earliestEnd>0950-01-01</earliestEnd>
<latestEnd/>
</timespan>
<externalReferences>
<externalReference>
<identifier>https://www.geonames.org/2783054</identifier>
<type>closeMatch</type>
<referenceSystem>GeoNames</referenceSystem>
</externalReference>
</externalReferences>
<references>
<reference>
<abbreviation>Friesinger 1971-1974</abbreviation>
<id>36560</id>
<title>Herwig Friesinger, Studien zur Archäologie der Slawen in Niederösterreich. Mitteilungen der Prähistorischen Kommission 15/16 (Wien 1971-74).</title>
<pages>68</pages>
</reference>
</references>
<files/>
<types>
<type>
<id>5099</id>
<name>Excavation</name>
<path>Evidence > Archaeology</path>
<rootId>5097</rootId>
</type>
</types>
</properties>
</place>
<feature>
<id>158277</id>
<rootId>158275</rootId>
<parentId>158275</parentId>
<openatlasClassName>feature</openatlasClassName>
<crmClass>E18</crmClass>
<created>2021-03-17 14:30:16.736204</created>
<modified>2021-05-17 10:30:53.290824</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children>
<child>158279</child>
</children>
<properties>
<name>Grave 001</name>
<aliases/>
<description/>
<standardType>
<id>26205</id>
<name>Single Grave</name>
<path>Grave > Feature</path>
<rootId>13362</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files>
<file>
<id>161492</id>
<name>Absdorf_001</name>
<fileName/>
<license>Bildzitat</license>
<source/>
</file>
</files>
<types>
<type>
<id>15679</id>
<name>Height</name>
<value>40.0</value>
<unit>cm</unit>
<path>Dimensions</path>
<rootId>15678</rootId>
</type>
<type>
<id>22300</id>
<name>Earth pit</name>
<path>Grave Construction</path>
<rootId>5118</rootId>
</type>
<type>
<id>22309</id>
<name>Flat grave</name>
<path>Grave Shape</path>
<rootId>22308</rootId>
</type>
</types>
</properties>
</feature>
<feature>
<id>158281</id>
<rootId>158275</rootId>
<parentId>158275</parentId>
<openatlasClassName>feature</openatlasClassName>
<crmClass>E18</crmClass>
<created>2021-03-17 14:40:17.421455</created>
<modified>2021-05-17 10:30:53.290824</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children>
<child>158283</child>
</children>
<properties>
<name>Grave 002</name>
<aliases/>
<description/>
<standardType>
<id>26205</id>
<name>Single Grave</name>
<path>Grave > Feature</path>
<rootId>13362</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files>
<file>
<id>161493</id>
<name>Absdorf_002</name>
<fileName/>
<license>Bildzitat</license>
<source/>
</file>
</files>
<types>
<type>
<id>15679</id>
<name>Height</name>
<value>205.0</value>
<unit>cm</unit>
<path>Dimensions</path>
<rootId>15678</rootId>
</type>
<type>
<id>22300</id>
<name>Earth pit</name>
<path>Grave Construction</path>
<rootId>5118</rootId>
</type>
<type>
<id>22309</id>
<name>Flat grave</name>
<path>Grave Shape</path>
<rootId>22308</rootId>
</type>
</types>
</properties>
</feature>
<stratigraphic_unit>
<id>158283</id>
<rootId>158275</rootId>
<parentId>158281</parentId>
<openatlasClassName>stratigraphic_unit</openatlasClassName>
<crmClass>E18</crmClass>
<created>2021-03-17 14:41:19.106407</created>
<modified>2021-05-17 10:30:53.290824</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children>
<child>158285</child>
<child>158287</child>
<child>158289</child>
<child>158291</child>
</children>
<properties>
<name>Burial 002</name>
<aliases/>
<description/>
<standardType>
<id>26519</id>
<name>Skeleton</name>
<path>Burial (strat. Unit) > Stratigraphic unit</path>
<rootId>13365</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files/>
<types>
<type>
<id>22283</id>
<name>Grown up</name>
<path>Anthropology > Age</path>
<rootId>22276</rootId>
</type>
<type>
<id>128052</id>
<name>Supine position</name>
<path>Body posture > General</path>
<rootId>128046</rootId>
</type>
<type>
<id>120168</id>
<name>Female</name>
<path>Gender</path>
<rootId>119049</rootId>
</type>
<type>
<id>158201</id>
<name>West-East</name>
<path>Orientation</path>
<rootId>158197</rootId>
</type>
</types>
</properties>
</stratigraphic_unit>
<artifact>
<id>158285</id>
<rootId>158275</rootId>
<parentId>158283</parentId>
<openatlasClassName>artifact</openatlasClassName>
<crmClass>E22</crmClass>
<created>2021-03-17 14:48:50.584367</created>
<modified>2021-11-18 10:30:37.672755</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children/>
<properties>
<name>Find 001 (NÖLM Inv. No. 6993)</name>
<aliases/>
<description>Pressed, bulbous pot, heavily marbled with limestones and graphite chunks. Level base. Slightly curved, conical wall part. Spherical shoulder. Mouth rim tapering diagonally to the top and cut off with a sharp rim diagonally to the bottom. Traces of molded wood on the inside of the mouth seam. The shoulder is decorated with two rows of irregular, obliquely positioned, comb-shaped inlays. At the shoulder border and just below it, two rows of four-lined, closely set, circumferential wavy bands. NÖLM, Inv. No. 6993</description>
<standardType>
<id>26301</id>
<name>Pot</name>
<path>Pottery > Artifact</path>
<rootId>157754</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files>
<file>
<id>161495</id>
<name>Absdorf_002_01</name>
<fileName/>
<license>Bildzitat</license>
<source/>
</file>
</files>
<types>
<type>
<id>156461</id>
<name>Brown</name>
<path>Color</path>
<rootId>141697</rootId>
</type>
<type>
<id>141703</id>
<name>Black</name>
<path>Color</path>
<rootId>141697</rootId>
</type>
<type>
<id>150412</id>
<name>Bottom diameter</name>
<value>8.6</value>
<unit>cm</unit>
<path>Dimensions > Diameter</path>
<rootId>15678</rootId>
</type>
<type>
<id>150696</id>
<name>Max Diameter</name>
<value>16.2</value>
<unit>cm</unit>
<path>Dimensions > Diameter</path>
<rootId>15678</rootId>
</type>
<type>
<id>150413</id>
<name>Top Diameter</name>
<value>13.2</value>
<unit>cm</unit>
<path>Dimensions > Diameter</path>
<rootId>15678</rootId>
</type>
<type>
<id>15679</id>
<name>Height</name>
<value>15.7</value>
<unit>cm</unit>
<path>Dimensions</path>
<rootId>15678</rootId>
</type>
<type>
<id>26561</id>
<name>Ceramic</name>
<value>0.0</value>
<unit>weight percentage (0 = unknown)</unit>
<path>Material > Geological > Clay</path>
<rootId>21160</rootId>
</type>
<type>
<id>23448</id>
<name>Foot left</name>
<path>Position of Find in Grave > Lower Body > Leg Left</path>
<rootId>23440</rootId>
</type>
</types>
</properties>
</artifact>
<artifact>
<id>158287</id>
<rootId>158275</rootId>
<parentId>158283</parentId>
<openatlasClassName>artifact</openatlasClassName>
<crmClass>E22</crmClass>
<created>2021-03-17 14:50:10.530195</created>
<modified>2021-11-18 10:30:37.672755</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children/>
<properties>
<name>Find 002 (NÖLM Inv. No. 6994)</name>
<aliases/>
<description>Shells of three eggs were found next to the skull. NÖLM Inv. No. 6994</description>
<standardType>
<id>26501</id>
<name>Eggshells</name>
<path>Food Offering > Cult Object > Artifact</path>
<rootId>157754</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files/>
<types>
<type>
<id>26536</id>
<name>Egg Shell</name>
<value>0.0</value>
<unit>weight percentage (0 = unknown)</unit>
<path>Material > Organic > Animal</path>
<rootId>21160</rootId>
</type>
<type>
<id>23441</id>
<name>Head</name>
<path>Position of Find in Grave</path>
<rootId>23440</rootId>
</type>
<type>
<id>128787</id>
<name>Number</name>
<value>3.0</value>
<unit>pcs.</unit>
<path>Count</path>
<rootId>128783</rootId>
</type>
</types>
</properties>
</artifact>
<artifact>
<id>158289</id>
<rootId>158275</rootId>
<parentId>158283</parentId>
<openatlasClassName>artifact</openatlasClassName>
<crmClass>E22</crmClass>
<created>2021-03-17 14:52:22.473759</created>
<modified>2021-11-18 10:30:37.672755</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children/>
<properties>
<name>Find 003 (NÖLM Inv. No. 6994)</name>
<aliases/>
<description>One eggshell was found underneath the pot (see find 001) that was placed next to the left foot. NÖLM Inv. No. 6994</description>
<standardType>
<id>26501</id>
<name>Eggshells</name>
<path>Food Offering > Cult Object > Artifact</path>
<rootId>157754</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files/>
<types>
<type>
<id>26536</id>
<name>Egg Shell</name>
<value>0.0</value>
<unit>weight percentage (0 = unknown)</unit>
<path>Material > Organic > Animal</path>
<rootId>21160</rootId>
</type>
<type>
<id>23448</id>
<name>Foot left</name>
<path>Position of Find in Grave > Lower Body > Leg Left</path>
<rootId>23440</rootId>
</type>
</types>
</properties>
</artifact>
<feature>
<id>158295</id>
<rootId>158275</rootId>
<parentId>158275</parentId>
<openatlasClassName>feature</openatlasClassName>
<crmClass>E18</crmClass>
<created>2021-03-17 14:58:08.057725</created>
<modified>2021-05-17 10:30:53.290824</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children>
<child>158297</child>
</children>
<properties>
<name>Grave 003</name>
<aliases/>
<description/>
<standardType>
<id>26205</id>
<name>Single Grave</name>
<path>Grave > Feature</path>
<rootId>13362</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files>
<file>
<id>161494</id>
<name>Absdorf_003</name>
<fileName/>
<license>Bildzitat</license>
<source/>
</file>
</files>
<types>
<type>
<id>22300</id>
<name>Earth pit</name>
<path>Grave Construction</path>
<rootId>5118</rootId>
</type>
<type>
<id>22309</id>
<name>Flat grave</name>
<path>Grave Shape</path>
<rootId>22308</rootId>
</type>
</types>
</properties>
</feature>
<stratigraphic_unit>
<id>158297</id>
<rootId>158275</rootId>
<parentId>158295</parentId>
<openatlasClassName>stratigraphic_unit</openatlasClassName>
<crmClass>E18</crmClass>
<created>2021-03-17 14:59:15.987592</created>
<modified>2021-05-17 10:30:53.290824</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children>
<child>158299</child>
</children>
<properties>
<name>Burial 001</name>
<aliases/>
<description>The burial is disturbed - the skull is missing and the rib cage not in situ.</description>
<standardType>
<id>26519</id>
<name>Skeleton</name>
<path>Burial (strat. Unit) > Stratigraphic unit</path>
<rootId>13365</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files/>
<types>
<type>
<id>22283</id>
<name>Grown up</name>
<path>Anthropology > Age</path>
<rootId>22276</rootId>
</type>
<type>
<id>145466</id>
<name>Disturbed</name>
<path>Condition of Burial</path>
<rootId>145463</rootId>
</type>
<type>
<id>120167</id>
<name>Male</name>
<path>Gender</path>
<rootId>119049</rootId>
</type>
</types>
</properties>
</stratigraphic_unit>
<artifact>
<id>158299</id>
<rootId>158275</rootId>
<parentId>158297</parentId>
<openatlasClassName>artifact</openatlasClassName>
<crmClass>E22</crmClass>
<created>2021-03-17 15:00:02.579510</created>
<modified>2021-11-18 10:30:37.672755</modified>
<latestModRec>2021-11-18T10:30:37.672755</latestModRec>
<geometry>
<type>GeometryCollection</type>
<geometries/>
</geometry>
<children/>
<properties>
<name>Find 001</name>
<aliases/>
<description>A piece of iron is mentioned in the find report but lost without a trace.</description>
<standardType>
<id>123766</id>
<name>Fragment</name>
<path>Varia > Artifact</path>
<rootId>157754</rootId>
</standardType>
<timespan>
<earliestBegin/>
<latestBegin/>
<earliestEnd/>
<latestEnd/>
</timespan>
<externalReferences/>
<references/>
<files/>
<types>
<type>
<id>26540</id>
<name>Iron</name>
<value>0.0</value>
<unit>weight percentage (0 = unknown)</unit>
<path>Material > Metal</path>
<rootId>21160</rootId>
</type>
</types>
</properties>
</artifact>
</collection>
h2. Sex Estimation
Labels and texts are stored in language-specific gettext files located in openatlas/translations. These files can be created or edited, e.g. with Poedit, even by non-programmers.
Because the OpenAtlas team can only provide English and German translations, offers to add other translations are always welcome.
These commands scan the code base and update the language files.
pybabel extract -F openatlas/translations/babel.cfg -k lazy_gettext -o openatlas/translations/messages.pot .
pybabel update -i openatlas/translations/messages.pot -d openatlas/translations
Once updated, the language files can be used with Poedit to add or update translations.
git checkout develop
git checkout -b my_branch
git checkout develop
git pull origin develop
git checkout my_branch
git merge develop
git add .
git commit -m "Added translation for XX"
git push origin my_branch
WARNING: existing translations will be deleted! In this example with the de parameter for German at the end of the second command:
pybabel extract -F openatlas/translations/babel.cfg -k lazy_gettext -o openatlas/translations/messages.pot .
pybabel init -i openatlas/translations/messages.pot -d openatlas/translations -l de
Most of the strings to be translated are in the Python and HTML source code, marked with the underscore function, e.g.
_('actor')
One exception is the JavaScript translations for third-party code located in openatlas/static/vendor; see the README files in the subfolders there for more details.
To accomplish a truly RESTful API, a restructuring is in order. These are just ideas:
- Rename results to entities
- Rename the lang parameter to language
- Rename lp to lpf in JSON_FORMATS
- Change _ in the endpoints and parameters to - (add link why this is important)
OpenAtlas is extended continuously. To keep it viable we agree on workflows and standards:
Although OpenAtlas is developed and tested on Linux (Debian, to be precise), there are cases in which one might install it on a Windows system, e.g. to try it out or for development.
We don't recommend a Windows installation for productive use but nevertheless want to give some instructions on how to go about it.
Keep in mind that these instructions are not maintained as regularly as the ones for the main Linux environment and might therefore not be entirely up to date.
You will find a package list in requirements.txt at the root of the OpenAtlas application, which can be used e.g. to install the software with pip.
Furthermore you will need Microsoft Visual C++ 14.0 or greater to be able to install all required packages. You can get it via https://visualstudio.microsoft.com/visual-cpp-build-tools/. The name of the required package in the Build Tools is "MSVC... Buildtools"; we tested it with "MSVC v143 - VS 2022 C++-x64/x86-Buildtools", but it might work with older versions as well.
The download link for the Windows installer of the newest PostgreSQL version can be found on [[https://www.postgresql.org/download/windows/]] in the first line under "Download the installer".
Choose the appropriate version and use the installer to get PostgreSQL. During the process you can choose to use the Stack Builder tool to install additional extensions; through this you can also easily obtain and install PostGIS, though there are other ways as well should you want them.
Change the locale to en_US UTF-8, otherwise the export tests will fail.
PostgreSQL
Add C:\Program Files\PostgreSQL\13\bin to the Path variable in the environment variables
7zip
Add C:\Program Files\7-Zip to the Path variable in the environment variables
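A quick way to verify the Path changes took effect is to look the tools up from Python; a small sketch (the command names psql and 7z are assumptions for the binaries installed above):

```python
import shutil

# Check whether the command line tools added to the Path above are found;
# prints the resolved executable path, or NOT FOUND if the Path is wrong.
for tool in ('psql', '7z'):
    print(tool, '->', shutil.which(tool) or 'NOT FOUND')
```

If a tool shows up as NOT FOUND after restarting the shell, double-check the corresponding Path entry.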
Download the newest version from https://imagemagick.org/script/download.php#windows and install it, following this guide: https://docs.wand-py.org/en/0.6.10/guide/install.html#install-imagemagick-on-windows
Important: install the DLL version, not the static one!
Restart your PC to make sure all PATH variables are successfully registered.
See: OpenAtlas/Install/1_structure
"production.py" & "testing.py"
These are used to set variables for the project configuration. You will find an example in the instance folder.
Copy example_production.py in the same folder and rename it to production.py.
Change DATABASE_PASS to the password you have set, or want, for the OpenAtlas database.
For testing.py repeat the same steps, but use example_testing.py instead of example_production.py.
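A hypothetical minimal production.py could then look as follows. Only DATABASE_PASS is named in the instructions above; the other variable names are assumptions for illustration, the authoritative list is in example_production.py:

```python
# Hypothetical minimal instance/production.py sketch; only DATABASE_PASS is
# mentioned in the instructions above, the other names are assumptions.
DATABASE_NAME = 'openatlas'
DATABASE_USER = 'openatlas'
DATABASE_PASS = 'CHANGE ME'  # the password set for the OpenAtlas database
DATABASE_HOST = 'localhost'
DATABASE_PORT = 5432
```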
In the project root directory run the following command:
- pip install -r requirements.txt
Uncomment the lines following "# Used for testing" in "requirements.txt" then execute the following command:
- pip install -r requirements.txt
For test running:
In order to run the tests on Windows you'll need to delete the tests/__init__.py file and then run the command:
git update-index --skip-worktree .\tests\__init__.py
This will prevent your changes to tests/__init__.py
from being committed. For information on how or why this works, please look at https://stackoverflow.com/a/14462290 and https://stackoverflow.com/a/13631525 respectively.
python3.9.exe -m nose --verbose --config=.noserc
If you encounter the error "./test_reference_system.py::ReferenceSystemTest::test_reference_system Failed with Error: 'file_upload_max_size'" during testing, you can try deleting all files in the "../files/uploads" directory, except for ".gitignore". Sometimes, on Windows, images from previous tests may not be cleared, causing subsequent tests to fail.
If you have other files in the directory that you don't want to delete: the files uploaded by the tests are the OpenAtlas logo in both .jpeg and .png format, as well as one .json file describing OpenAtlas. Removing these files should be sufficient.
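The cleanup can also be scripted; a sketch that keeps .gitignore and removes everything else (UPLOAD_DIR is an assumption, adjust it to your installation's upload directory):

```python
from pathlib import Path

# Assumed location of the upload directory; adjust to your installation.
UPLOAD_DIR = Path('files') / 'uploads'

if UPLOAD_DIR.exists():
    for file in UPLOAD_DIR.iterdir():
        if file.is_file() and file.name != '.gitignore':
            file.unlink()  # remove leftover test uploads, keep .gitignore
```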
Use PowerShell
Get scoop: https://scoop.sh/
OpenAtlas
scoop install git
scoop bucket add versions
scoop bucket add extras
scoop install python39 imagemagick nodejs
Collaboration
scoop install pycharm mattermost
Add a new tmp path to testing.py, pointing to a location the user can write to, if /tmp is not available, e.g.:
from pathlib import Path
TMP_DIR = Path('C:\\Users\\USERNAME')
Add
LOAD_WINDOWS_TEST_SQL = True
to testing.py to load the SQL for Windows.
C:\iiif