
Latest recommendations

02 Apr 2024

Similarity Network Fusion: Understanding Patterns and their Spatial Significance in Archaeological Datasets

A different approach to similarity networks in Archaeology - Similarity Network Fusion

Recommended by Joel Santos based on reviews by Matthew Peeples and 1 anonymous reviewer

This is a fascinating paper for anyone interested in network analysis or in the chronology and cultures of the case study, the late prehistoric burial sites of Dorset, for which the author's approach offers a new perspective on an already deeply studied area [1]. The paper's implementation of Similarity Network Fusion (SNF) is noteworthy: the method is typically used in genetic research and had not previously been employed in archaeology. SNF has the potential to benefit archaeology significantly because of its distinctive capabilities and approach.

The author exhibits a deep and thorough understanding of previous work on material and similarity networks while emphasizing the innovative nature of this particular study. The SNF approach is intended to address the shortcomings, in certain situations, of the similarity coefficient most used in archaeology, the Brainerd-Robinson coefficient, mainly with heterogeneous and noisy datasets containing a small number of samples but a large number of measurements, scale differences, and collection biases, among other issues. The SNF technique, demonstrated in the case study, effectively fuses similarity networks derived from different data types into a single network.
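
For readers unfamiliar with the coefficients involved, the sketch below illustrates the general idea in Python: a Brainerd-Robinson similarity matrix is computed per data type, and several such matrices are then combined into one. The fusion step here is a deliberately crude average used only for illustration; the real SNF algorithm iteratively cross-diffuses normalised affinity matrices and is not reproduced here, and the function names and toy data are invented rather than taken from the author's code.

```python
import numpy as np

def brainerd_robinson(counts_a, counts_b):
    """Classic Brainerd-Robinson similarity between two assemblages.

    Inputs are raw counts per artefact category; the coefficient is
    computed on percentages and ranges from 0 (no overlap) to 200
    (identical proportions)."""
    p_a = 100 * counts_a / counts_a.sum()
    p_b = 100 * counts_b / counts_b.sum()
    return 200 - np.abs(p_a - p_b).sum()

def similarity_matrix(table):
    """Pairwise Brainerd-Robinson matrix for a sites x categories table."""
    n = table.shape[0]
    sim = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            sim[i, j] = brainerd_robinson(table[i], table[j])
    return sim

def naive_fusion(sim_matrices):
    """Crude stand-in for SNF: rescale each similarity matrix to [0, 1]
    and average them. Real SNF instead cross-diffuses normalised
    affinity matrices iteratively, which preserves complementary
    structure far better than a plain mean."""
    rescaled = [(m - m.min()) / (m.max() - m.min()) for m in sim_matrices]
    return sum(rescaled) / len(rescaled)

# Toy example: two data types (e.g. grave goods and burial architecture).
goods = np.array([[10, 0, 5], [8, 1, 6], [0, 12, 2]], dtype=float)
architecture = np.array([[3, 3], [2, 4], [4, 1]], dtype=float)
fused = naive_fusion([similarity_matrix(goods), similarity_matrix(architecture)])
print(fused)
```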

As the Dorset case study shows, SNF has great potential in archaeology, even when applied to already available data, allowing us to go further and bring new perspectives to existing interpretations. As the author states, SNF also shows potential for other applications and fields in archaeology dealing with similar datasets, such as archaeobotany or archaeozoology, and appears to complement other multivariate statistical approaches, such as correspondence or cluster analysis.

This paper has been subject to two excellent reviews, whose suggestions the author mostly accepted. One review was more technical, improving the article's metadata, data availability and clarity, among other points. The second review was more conceptual and, while it also offered some excellent technical input, focused more on complementary aspects that will allow the paper to reach a wider audience. I warmly recommend its publication.

References

[1] Geitlinger, T. (2024). Similarity Network Fusion: Understanding Patterns and their Spatial Significance in Archaeological Datasets. Zenodo, 7998239, ver. 3 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7998239

 

Similarity Network Fusion: Understanding Patterns and their Spatial Significance in Archaeological Datasets
Authors: Timo Geitlinger
Abstract: Since its earliest application in the 1970s, network analysis has become increasingly popular in both theoretical and GIS-based archaeology. Yet, applications of material networks remained relatively restricted. This paper describes a specific ...
Thematic fields: Computational archaeology, Protohistory
Recommender: Joel Santos
Submitted: 2023-06-02 16:51:19
06 Oct 2023

Body Mapping the Digital: Visually representing the impact of technology on archaeological practice.

Understanding archaeological documentation through a participatory, arts-based approach

Recommended by Nicolo Dell'Unto based on reviews by 2 anonymous reviewers

This paper presents the use of a participatory arts-based methodology to understand how digital and analogue tools affect individuals' participation in the process of archaeological recording and interpretation. The preliminary results of this work highlight the importance of rethinking archaeologists' relationship with different recording methods, emphasising the need to recognise the value of both approaches and to adopt a documentation strategy that exploits the strengths of both analogue and digital methods.

Although a larger group of participants with broader and more varied experience would have provided a clearer picture of the impact of technology on current archaeological practice, the article makes an important contribution in highlighting the complex and not always easy transition that archaeologists trained in analogue methods are currently experiencing when using digital technology. This is assessed using arts-based methodologies that enable archaeologists to consider how digital technologies are changing the relationship between mind, body and practice.

I found the range of experiences described in the paper by the archaeologists involved in the experiment particularly interesting and very representative of the change in practice that we are all experiencing. As the article notes, the two approaches cannot be directly compared because they offer different possibilities: while analogue methods foster a deeper connection with the archaeological material, digital documentation seems to be perceived as more effective in terms of data capture, information exchange and data sharing (Araar et al., 2023).

It seems to me that an important element to consider in such a study is the generational shift and the striking divide between digital natives and non-natives.

The critical issues highlighted in the paper are central and provide important directions for navigating this ongoing (digital) transition.

References

Araar, L., Morgan, C. and Fowler, L. (2023). Body Mapping the Digital: Visually representing the impact of technology on archaeological practice, Zenodo, 7990581, ver. 5 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7990581

Body Mapping the Digital: Visually representing the impact of technology on archaeological practice.
Authors: Araar, Leila; Morgan, Colleen; Fowler, Louise
Abstract: This paper uses a participatory, art-based methodology to understand how digital and analog tools impact individuals' experience and perceptions of archaeological recording. Body mapping involves the co-creation of life-sized drawings and narra...
Thematic fields: Computational archaeology, Theoretical archaeology
Recommender: Nicolo Dell'Unto
Submitted: 2023-06-01 09:06:52
13 Jan 2024

Dealing with post-excavation data: the Omeka S TiMMA web-database

Managing Archaeological Data with Omeka S

Recommended by Jonathan Hanna based on reviews by Electra Tsaknaki and 1 anonymous reviewer

Managing data in archaeology is a perennial problem. As the adage goes, every day in the field equates to several days in the lab (and beyond). For better or worse, past archaeologists did all their organizing and synthesis manually, by hand, but since the 1970s ways of digitizing data for long-term management and analysis have gained increasing attention [1]. It is debatable whether this ever actually made things easier, particularly given the associated problem of sustainable maintenance and accessibility of the data. Many older archaeologists, for instance, still have reels and tapes full of data that now require a new form of archaeology to excavate (see [2] for an unrealized idea on how to solve this).

Today, the options for managing digital archaeological data are limited only by one’s imagination. There are systems built specifically for archaeology, such as Arches [3], Ark [4], Codifi [5], Heurist [6], InTerris Registries [7], OpenAtlas [8], S-Archeo [9], and Wild Note [10], as well as those geared towards museum collections like PastPerfect [11] and CatalogIt [12], among others. There are also mainstream databases that can be adapted to archaeological needs like MS Access [13] and Claris FileMaker [14], as well as various web database apps that function in much the same way (e.g., Caspio [15], dbBee [16], Amazon's Simpledb [17], Sci-Note [18], etc.) — all with their own limitations in size, price, and utility. One could also write the code for specific database needs using pre-built frameworks like those in Ruby-On-Rails [19] or similar languages. And of course, recent advances in machine-learning and AI will undoubtedly bring new solutions in the near future.

But let’s be honest — most archaeologists probably just use Excel. That's partly because, given all the options, it is hard to decide on the best tool and whether it's worth changing from your current system, especially given the few real-world examples in the literature. Bastien Rueff’s new paper [20] is therefore a welcome presentation on the use of Omeka S [21] to manage data collected for the Timbers in Minoan and Mycenaean Architecture (TiMMA) project. Omeka S is an open-source web database based on PHP and MySQL, and although it was built with the goal of connecting digital cultural heritage collections with other resources online, it has rarely been used in archaeology. Part of the issue is that Omeka Classic was built for use on individual sites, but this has now been scaled up in Omeka S to accommodate a plurality of sites.

Some of the strengths of Omeka S include its open-source availability (accessible regardless of budget), the way it links to data stored elsewhere on the web (keeping the database itself lean), its ability to import data from common file types, and its multilingual support. The latter feature was particularly important to the TiMMA project because it allowed members of the team (English, Greek, French, and Italian speakers, among others) to enter data into the system in whatever language they felt most comfortable.

However, there are several limitations specific to Omeka S that will hinder widespread adoption. Among these, Omeka S apparently lacks the ability to export metadata, auto-fill forms, produce summaries or reports, or provide basic statistical analysis. Its internal search capabilities also appear extremely limited. And that is not to mention the barriers typical of any new software, such as onerous technical training, questionable long-term sustainability, or the need for initial digitization and formatting of data. Given the rather restricted use case for Omeka S, it appears that this is not a comprehensive tool but merely one for data entry and storage that requires complementary software to carry out common tasks.
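
In practice, that complementary software can often talk to Omeka S over its REST API. The sketch below is a minimal, hypothetical example of pulling items into a CSV for analysis elsewhere; it assumes the standard Omeka S /api/items endpoint, the key_identity/key_credential query parameters and the o:id/o:title JSON-LD fields, all of which should be checked against the documentation of the specific installation (the URL and credentials here are placeholders, not the TiMMA database).

```python
import csv
import requests

BASE_URL = "https://example.org/omeka-s"   # hypothetical installation
KEY_IDENTITY = "YOUR_KEY_IDENTITY"          # API credentials, only needed
KEY_CREDENTIAL = "YOUR_KEY_CREDENTIAL"      # if the endpoint is not public

def fetch_items(per_page=100):
    """Page through /api/items and yield raw item dictionaries."""
    page = 1
    while True:
        resp = requests.get(
            f"{BASE_URL}/api/items",
            params={
                "page": page,
                "per_page": per_page,
                "key_identity": KEY_IDENTITY,
                "key_credential": KEY_CREDENTIAL,
            },
            timeout=30,
        )
        resp.raise_for_status()
        items = resp.json()
        if not items:
            break
        yield from items
        page += 1

def export_csv(path="omeka_items.csv"):
    """Flatten a couple of core fields into a CSV for analysis elsewhere."""
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh)
        writer.writerow(["id", "title"])
        for item in fetch_items():
            writer.writerow([item["o:id"], item.get("o:title") or ""])

# export_csv()  # run against a real, reachable installation
```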

As such, Rueff has provided a review of a program that most archaeologists will likely not want or need. But if one was considering adopting Omeka S for a project, then this paper offers critical information for how to go about that. It is a thorough overview of the software package and offers an excellent example of its use in archaeological practice.


NOTES

[1] Doran, J. E., and F. R. Hodson (1975) Mathematics and Computers in Archaeology. Harvard University Press.

[2] Snow, Dean R., Mark Gahegan, C. Lee Giles, Kenneth G. Hirth, George R. Milner, Prasenjit Mitra, and James Z. Wang (2006) Cybertools and Archaeology. Science 311(5763):958–959.

[3] https://www.archesproject.org/

[4] https://ark.lparchaeology.com/

[5] https://codifi.com/

[6] https://heuristnetwork.org/

[7] https://www.interrisreg.org/

[8] https://openatlas.eu/

[9] https://www.skinsoft-lab.com/software/archaelogy-collection-management

[10] https://wildnoteapp.com/

[11] https://museumsoftware.com/

[12] https://www.catalogit.app/

[13] https://www.microsoft.com/en-us/microsoft-365/access

[14] https://www.claris.com/filemaker/

[15] https://www.caspio.com/

[16] https://www.dbbee.com/

[17] https://aws.amazon.com/simpledb/

[18] https://www.scinote.net/

[19] https://rubyonrails.org/

[20] Rueff, Bastien (2023) Dealing with Post-Excavation Data: The Omeka S TiMMA Web-Database. peer-reviewed and recommended by Peer Community in Archaeology. https://zenodo.org/records/7989905

[21] https://omeka.org/

 

Dealing with post-excavation data: the Omeka S TiMMA web-database
Authors: Bastien Rueff
Abstract: This paper reports on the creation and use of a web database designed as part of the TiMMA project with the Content Management System Omeka S. Rather than resulting in a technical manual, its goal is to analyze the relevance of using Omeka S in...
Thematic fields: Buildings archaeology, Computational archaeology
Recommender: Jonathan Hanna
Submitted: 2023-05-31 12:16:25
29 Jan 2024

Visual encoding of a 3D virtual reconstruction's scientific justification: feedback from a proof-of-concept research

3D Models, Knowledge and Visualization: a prototype for 3D virtual models according to plausible criteria

Recommended by Daniel Carvalho based on reviews by Robert Bischoff and Louise Tharandt

The construction of 3D realities is deeply embedded in archaeological practices. From sites to artifacts, archaeology has dedicated itself to creating digital copies for the most varied purposes. The paper “Visual encoding of a 3D virtual reconstruction's scientific justification: feedback from a proof-of-concept research” (Blaise et al. 2024) represents an advance in the sense that it does not just offer a three-dimensional theory for archaeological practice, but rather makes proposals regarding the epistemic component: how knowledge can be represented through the workflow of 3D virtual reconstructions themselves. The authors aim to unite three main axes (knowledge modeling, visual encoding and 3D content reuse) (Blaise et al. 2024: 2), which, for all intents and purposes, form the basis of this article. With regard to the first aspect, this work questions how it is possible to transmit the knowledge we want through a 3D model and how we can optimize this epistemic component. A methodology based on plausibility criteria is offered, which, for the archaeological field, opens relevant space for reflection. Given our inability to fully understand the object or site that is the subject of the 3D representation, whether in space or time, building a method based on probabilistic categories is probably one of the most realistic approaches to the realities of the past.

Thus, establishing a plausibility criterion allows the user to question the knowledge that is transmitted through the representation, and to corroborate or refute it in future situations. This is because the reuse of these models is of great interest to the authors, a perfectly justifiable concern, as it encourages a critical view of scientific practices. Visual encoding is, in its conjunction with knowledge practices, a key element. The notion of simplicity underlying Maeda's (2006) design principles not only represents a way of thinking that favors operability, but is also reflected in the user-friendly design of the prototype that the authors have created. This is also visible in the reuse of parts of the models in a chronological logic: adapting the models based on architectural elements that can be removed or molded is a testament to intelligent design, whereby instead of redoing models in their entirety, they are partially reused for other purposes.

All these factors come together in the final prototype, a web application that combines a relational database (RDBMS) with a data mapper (MassiveJS), using the PHP programming language. The example used is the Marmoutier Abbey hostelry, a centuries-old building which, according to the sources presented, has evolved architecturally over several centuries (Blaise et al. 2024: 8). These states of the building are represented visually through architectural elements characterized by their existence, location, shape and size, always in terms of what is presented as plausible. This allows not only the creation of a matrix in which various categories are related to various architectural elements, but also a visual aid, through a chromatic spectrum, for the plausibility that the authors are aiming to convey.
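
To make the idea of the plausibility matrix concrete, here is a minimal sketch, in Python rather than the authors' PHP stack, of attaching ordinal plausibility values to the existence, location, shape and size of an architectural element and mapping the aggregate onto a colour ramp. The category names, scale and colours are invented for illustration and do not reproduce the prototype's actual schema.

```python
from dataclasses import dataclass

# Hypothetical ordinal scale; the prototype's own categories may differ.
PLAUSIBILITY = {"attested": 3, "likely": 2, "hypothetical": 1}

# Simple green -> yellow -> red ramp keyed to the rounded mean score.
COLOUR_RAMP = {3: "#2b8a3e", 2: "#f4c20d", 1: "#c92a2a"}

@dataclass
class ArchitecturalElement:
    name: str
    existence: str
    location: str
    shape: str
    size: str

    def mean_score(self) -> float:
        """Average plausibility across the four visual-encoding criteria."""
        scores = [PLAUSIBILITY[v] for v in
                  (self.existence, self.location, self.shape, self.size)]
        return sum(scores) / len(scores)

    def colour(self) -> str:
        """Colour used to render the element in the 3D scene."""
        return COLOUR_RAMP[round(self.mean_score())]

window = ArchitecturalElement("north window, state 2",
                              existence="attested", location="likely",
                              shape="hypothetical", size="hypothetical")
print(window.mean_score(), window.colour())   # 1.75 '#f4c20d'
```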

In short, this is an article that seeks to rethink the degree of knowledge we can obtain through 3D visualizations and that does not take models as static, but rather as realities that must be explored, recycled and reinterpreted in the light of different data, users and future research. For this reason, it is a work of great relevance to theoretical advances in 3D modeling adapted to archaeology.

 

References

Blaise, J.-Y., Dudek, I., Bergerot, L. and Simon, G. (2024). Visual encoding of a 3D virtual reconstruction's scientific justification: feedback from a proof-of-concept research, Zenodo, 7983163, ver. 3 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.10496540

Maeda, J. (2006). The Laws of Simplicity. MIT Press, Cambridge, MA, USA.

Visual encoding of a 3D virtual reconstruction's scientific justification: feedback from a proof-of-concept research
Authors: J.Y. Blaise, I. Dudek, L. Bergerot, G. Simon
Abstract: 3D virtual reconstructions have become over the last decades a classical mean to communicate about analysts’ visions concerning past stages of development of an edifice or a site. However, they still today remain quite often a one-s...
Thematic fields: Computational archaeology, Spatial analysis
Recommender: Daniel Carvalho
Submitted: 2023-05-30 00:43:03
05 Jan 2024

Transforming the CIDOC-CRM model into a megalithic monument property graph

Informative description of a project implementing a CIDOC-CRM based native graph database for representing megalithic information

Recommended by Isto Huvila based on reviews by 2 anonymous reviewers

The paper “Transforming the CIDOC-CRM model into a megalithic monument property graph” describes an interesting endeavour of developing and implementing a CIDOC-CRM-based knowledge graph using a native graph database (Neo4j) to represent megalithic information (Câmara et al. 2023). While there are earlier examples of using native graph databases and CIDOC-CRM in diverse heritage contexts, the present paper is a useful addition to the literature as a detailed description of an implementation in the context of megalithic heritage. The paper provides a demonstration of a working implementation and guidance for future projects. The described project is also documented to an extent that will open up interesting opportunities to compare the approach to previous and forthcoming implementations. The same applies to the knowledge graph and the use of CIDOC-CRM in the project.
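
For readers curious about what such an implementation can look like in code, the following is a minimal sketch, using the official Neo4j Python driver, of writing a CIDOC-CRM-flavoured pattern (a monument composed of a building component) into a property graph. The class and property labels, connection details and example data are illustrative assumptions and do not reproduce the authors' actual model.

```python
from neo4j import GraphDatabase  # official Neo4j Python driver

URI = "bolt://localhost:7687"    # hypothetical local instance
AUTH = ("neo4j", "password")     # replace with real credentials

# Illustrative CIDOC-CRM-flavoured pattern: a dolmen modelled as an
# E22 Human-Made Object composed of a building component via P46.
# The classes/properties in the paper's actual KG may differ.
CREATE_QUERY = """
MERGE (m:E22_HumanMadeObject {name: $monument})
MERGE (c:E22_HumanMadeObject {name: $component, role: 'building component'})
MERGE (m)-[:P46_is_composed_of]->(c)
"""

def add_component(monument: str, component: str) -> None:
    """Write one monument/component pair into the property graph."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            session.run(CREATE_QUERY, monument=monument, component=component)

if __name__ == "__main__":
    # Requires a running Neo4j instance; names are invented examples.
    add_component("Dolmen of Pavia (example)", "capstone")
```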

Readers interested in comparing available technologies, and those developing their own knowledge graphs, might have benefited from a more detailed description of the work in relation to the current state of the art and of what the use of a native graph database in built-heritage contexts implies in practice for heritage documentation, beyond the fact that it is possible and has potentially meaningful performance-related advantages. While the reasons for relying on plain CIDOC-CRM instead of its extensions could also have been discussed in more detail, the approach demonstrates how plain CIDOC-CRM provides a good starting point for satisfying many heritage documentation needs.

As a whole, these shortcomings in positioning the work relative to the state of the art and in reflecting on and discussing design choices do not diminish the value of the paper as a case description for those interested in the use of native graph databases and CIDOC-CRM in heritage documentation in general and in the documentation of megalithic heritage in particular.

References

Câmara, A., de Almeida, A. and Oliveira, J. (2023). Transforming the CIDOC-CRM model into a megalithic monument property graph, Zenodo, 7981230, ver. 4 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7981230

Transforming the CIDOC-CRM model into a megalithic monument property graph
Authors: Ariele Câmara, Ana de Almeida, João Oliveira
Abstract: This paper presents a method to store information about megalithic monuments' building components as graph nodes in a knowledge graph (KG). As a case study we analyse the dolmens from the region of Pavia (Portugal). To build the KG, information...
Thematic fields: Computational archaeology
Recommender: Isto Huvila
Submitted: 2023-05-29 13:46:49
29 Aug 2023

Designing Stories from the Grave: Reviving the History of a City through Human Remains and Serious Games

AR and VR Gamification as a proof-of-concept

Recommended by Sebastian Hageneuer based on reviews by Sophie C. Schmidt and Tine Rassalle

Tsaknaki et al. (2023) discuss a work-in-progress project in which Cultural Heritage is communicated through Serious Games techniques in a story-centric immersive narration, rather than an exhibit-centered presentation, with the use of Gamification, Augmented and Virtual Reality technologies. In the introduction the authors present the project, called ECHOES, in which knowledge about the past of Thessaloniki, Greece, is to be turned into an immersive and interactive experience. After presenting related work and the methodology, the authors describe the proposed design of the Serious Game and close the article with a discussion and conclusions.

The paper is interesting because it highlights an ongoing process in the visualization of Cultural Heritage (see for example Champion 2016). The process the authors describe, of accomplishing this through Serious Games, Gamification, Augmented and Virtual Reality, is promising, although still hypothetical, as the project is ongoing. It remains to be seen whether the proposed visuals and interactive elements will work as intended and offer users an immersive experience after all. A preliminary questionnaire already showed that most of the respondents were not familiar with these technologies (AR, VR), and in my experience these numbers change only slowly. One way to overcome the technological barrier, however, might be the gamification of the experience, which the authors are planning to implement.

I decided to recommend this article based on the remarks of the two reviewers, which the authors implemented perfectly, as well as my own evaluation of the paper. Although the work is still in progress, it seems worthwhile to have this article as a basis for discussion and comparison with similar projects. However, the article does not mention the possible longevity of the data or how the usability of the Serious Game will be secured for the long term. One prominent problem with such endeavours is that we can read about these projects but never find them anywhere to test ourselves (see for example Gabellone et al. 2016). It is my hope, with this review and recommendation, that the ECHOES project will find a solution to this problem, and that we will not only be able to read this (and forthcoming) article(s) about the ECHOES project, but also play the Serious Game they are proposing in the near and distant future.

References


Champion, E. M. (2016). Entertaining the Similarities and Distinctions between Serious Games and Virtual Heritage Projects. Entertainment Computing 14: 67–74. https://doi.org/10.1016/j.entcom.2015.11.003

Gabellone, F., Lanorte, A., Masini, N. and Lasaponara, R. (2016). From Remote Sensing to a Serious Game: Digital Reconstruction of an Abandoned Medieval Village in Southern Italy. Journal of Cultural Heritage. https://doi.org/10.1016/j.culher.2016.01.012

Tsaknaki, E., Anastasovitis, E., Georgiou, G., Alagialoglou, K., Mavrokostidou, M., Kartsiakli, V., Aidonis, A., Protopsalti, T., Nikolopoulos, S. and Kompatsiaris, I. (2023). Designing Stories from the Grave: Reviving the History of a City through Human Remains and Serious Games, Zenodo, 7981323, ver. 4 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7981323

Designing Stories from the Grave: Reviving the History of a City through Human Remains and Serious Games
Authors: Tsaknaki, Electra; Anastasovitis, Eleftherios; Georgiou, Georgia; Alagialoglou, Kleopatra; Mavrokostidou, Maria; Kartsiakli, Vasiliki; Aidonis, Asterios; Protopsalti, Tania; Nikolopoulos, Spiros; Kompatsiaris, Ioannis
Abstract: The main challenge of the current digital transition is to utilize computing media and cutting-edge technology in a more meaningful way, which would make the archaeological and anthropological research outcomes relevant to a heterogeneous audien...
Thematic fields: Bioarchaeology, Computational archaeology, Europe
Recommender: Sebastian Hageneuer
Submitted: 2023-05-29 13:19:46
31 Jan 2024

Rivers vs. Roads? A route network model of transport infrastructure in Northern Italy during the Roman period

Modelling Roman Transport Infrastructure in Northern Italy

Recommended by Andrew McLean based on reviews by Pau de Soto and Adam Pažout

Studies of the economy of the Roman Empire have become increasingly interdisciplinary and nuanced in recent years, allowing the discipline to make great strides in data collection and, importantly, in the methods through which this increasing volume of data can be effectively and meaningfully analysed [see for example 1 and 2]. One of the key aspects of modelling the ancient economy is understanding movement and transport costs, and how these facilitated trade, communication and economic development. With archaeologists adopting more computational techniques and utilising GIS analysis beyond simply creating maps for visualisation, understanding and modelling the costs of traversing archaeological landscapes has become a much more fruitful avenue of research. Classical archaeologists are often slower to adopt these new computational techniques than others in the discipline. This is despite (or perhaps due to) the huge wealth of data available and the long period of time over which the Roman economy developed, thrived and evolved. This all means that the Roman Empire is a particularly useful proving ground for testing and perfecting new methodological developments, as well as being a particularly informative period of study for understanding ancient human behaviour more broadly. This paper by Page [3], then, is well placed and forms part of a much-needed and growing trend of Roman archaeologists adopting these computational approaches in their research.

Page’s methodology builds upon De Soto’s earlier modelling of transport costs [4] and applies it in a new setting. This reflects an important practice that should be more widely adopted in archaeology: using existing, well-documented methodologies in new contexts to offer wider comparisons. This allows existing methodologies to be tested more robustly and refined without reinventing the wheel. Page does all this well, and not only builds upon De Soto’s work, but does so using a case study that is particularly interesting, with convincing and significant results.

As Page highlights, Northern Italy is often thought of as relatively isolated in terms of economic exchange and transport, largely due to its distance from the sea and the barriers posed by the Alps and Apennines. However, in analysing this region without taking such assumptions for granted, Page quite convincingly shows that the waterways of the region played an important role in bringing down the cost of transport and allowed the region to be far more interconnected with the wider Roman world than previous studies have assumed.
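
The intuition behind this result is easy to reproduce with a toy least-cost network. The sketch below, using Python and networkx rather than the author's actual toolchain, assigns invented per-kilometre costs to road and river legs and shows how a cheaper river route wins out even over a longer distance; places, distances and costs are illustrative only, not De Soto's or Page's calibrated values.

```python
import networkx as nx

# Invented per-km transport costs; De Soto-style models calibrate these
# from ancient price evidence (river transport far cheaper than road).
COST_PER_KM = {"road": 1.0, "river_downstream": 0.2, "river_upstream": 0.4}

G = nx.DiGraph()

def add_leg(a, b, km, mode):
    """Add a directed leg whose cost depends on its mode and length."""
    G.add_edge(a, b, weight=km * COST_PER_KM[mode], mode=mode)

# Toy network with made-up distances: a Po river route and an overland
# alternative between the same endpoints.
add_leg("Placentia", "Cremona", 40, "river_downstream")
add_leg("Cremona", "Hostilia", 90, "river_downstream")
add_leg("Placentia", "Laus Pompeia", 30, "road")
add_leg("Laus Pompeia", "Hostilia", 95, "road")

path = nx.shortest_path(G, "Placentia", "Hostilia", weight="weight")
cost = nx.shortest_path_length(G, "Placentia", "Hostilia", weight="weight")
print(path, round(cost, 1))   # the river route wins despite longer legs
```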

This article is clearly a valuable and important contribution to our understanding of computational methods in archaeology, as well as of the economy and transport network of the Roman Empire. The article utilises innovative techniques to model transport in an area of the Roman Empire that is often overlooked, with the economic isolation of the area taken for granted. Having high-quality research such as this specifically analysing the region using the most current methodologies is of great importance. Furthermore, developing and improving methodologies like this allows different regions and case studies to be analysed and directly compared, in a way that more traditional analyses simply cannot do. As such, Page has demonstrated the importance of reanalysing traditional assumptions using the new data and analyses now available to archaeologists.

References

[1] Brughmans, T. and Wilson, A. (eds.) (2022). Simulating Roman Economies: Theories, Methods, and Computational Models. Oxford. 

[2] Dodd, E.K. and Van Limbergen, D. (eds.) (2024). Methods in Ancient Wine Archaeology: Scientific Approaches in Roman Contexts. London; New York.

[3] Page, J. (2024). Rivers vs. Roads? A route network model of transport infrastructure in Northern Italy during the Roman period, Zenodo, 7971399, ver. 3 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7971399

[4] De Soto, P. (2019). Network Analysis to Model and Analyse Roman Transport and Mobility. In: Finding the Limits of the Limes. Modelling Demography, Economy and Transport on the Edge of the Roman Empire. Ed. by Verhagen, P., Joyce, J. and Groenhuijzen, M. Springer Open Access, pp. 271–90. https://doi.org/10.1007/978-3-030-04576-0_13

Rivers vs. Roads? A route network model of transport infrastructure in Northern Italy during the Roman period
Authors: James Page
Abstract: Northern Italy has often been characterised as an isolated and marginal area during the Roman period, a region constricted by mountain ranges and its distance from major shipping lanes. Historians have frequently cited these obstacles, alongsid...
Thematic fields: Classic, Computational archaeology
Recommender: Andrew McLean
Submitted: 2023-05-28 15:11:31
19 Feb 2024

Social Network Analysis of Ancient Japanese Obsidian Artifacts Reflecting Sampling Bias Reduction

Evaluating Methods for Reducing Sampling Bias in Network Analysis

Recommended by James Allison based on reviews by Matthew Peeples and 1 anonymous reviewer

In a recent article, Fumihiro Sakahira and Hiro'omi Tsumura (2023) used social network analysis methods to analyze change in obsidian trade networks in Japan throughout the 13,000-year-long Jomon period. In the paper recommended here, Social Network Analysis of Ancient Japanese Obsidian Artifacts Reflecting Sampling Bias Reduction (Sakahira and Tsumura 2024), they revisit those data and describe additional analyses that confirm the robustness of their social network analysis. The data, analysis methods, and substantive conclusions of the two papers overlap; what this new paper adds is a detailed examination of the data and methods, including a bootstrap analysis used to demonstrate the reasonableness of the methods they used to group sites into clusters.

Both papers begin with a large dataset of approximately 21,000 artifacts from more than 250 sites dating to various times throughout the Jomon period. The number of sites and artifacts, varying sample sizes from the sites, as well as the length of the Jomon period, make interpretation of the data challenging. To help make the data easier to interpret and reduce problems with small sample sizes from some sites, the authors assign each site to one of five sub-periods, then define spatial clusters of sites within each period using the DBSCAN algorithm. Sites with at least three other sites within 10 km are joined into clusters, while sites that lack enough close neighbors are left as isolates. Clusters or isolated sites with sample sizes smaller than 30 were dropped, and the remaining sites and clusters became the nodes in the networks formed for each period, using cosine similarities of obsidian assemblages to define the strength of ties between clusters and sites.
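
The general shape of that procedure can be sketched as follows in Python with scikit-learn; the coordinates, counts, and the mapping of "at least three other sites within 10 km" onto DBSCAN's min_samples parameter are our own illustrative assumptions, not the authors' code or data.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics.pairwise import cosine_similarity

# Toy inputs: site coordinates in metres and obsidian counts per source.
coords = np.array([[0, 0], [2000, 1000], [4000, 3000], [8000, 1000],
                   [60000, 60000], [63000, 58000], [200000, 5000]])
counts = np.array([[30, 2, 0], [22, 3, 1], [25, 5, 1], [28, 0, 3],
                   [2, 40, 10], [1, 35, 12], [3, 1, 0]], dtype=float)

# "At least three other sites within 10 km" -> min_samples=4 here
# (DBSCAN counts the point itself); this mapping is our assumption.
labels = DBSCAN(eps=10_000, min_samples=4).fit_predict(coords)

# Pool assemblages per cluster; label -1 marks unclustered (isolated) sites.
nodes, assemblages = [], []
for lab in sorted(set(labels)):
    idx = np.where(labels == lab)[0]
    if lab == -1:                       # keep isolates as their own nodes
        for i in idx:
            nodes.append(f"site_{i}")
            assemblages.append(counts[i])
    else:
        nodes.append(f"cluster_{lab}")
        assemblages.append(counts[idx].sum(axis=0))

# Drop nodes with fewer than 30 artifacts, as in the paper.
assemblages = np.array(assemblages)
keep = assemblages.sum(axis=1) >= 30
sim = cosine_similarity(assemblages[keep])   # edge weights of the network
print([n for n, k in zip(nodes, keep) if k])
print(np.round(sim, 2))
```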

The main substantive result of Sakahira and Tsumura’s analysis is the demonstration that, during the Middle Jomon period (5500-4500 cal BP), clusters and isolated sites were much more connected than before or after that period. This is largely due to extensive distribution of obsidian from the Kozu-shima source, located on a small island off the Japanese mainland. Before the Middle Jomon period, Kozu-shima obsidian was mostly found at sites near the coast, but during the Middle Jomon, a trade network developed that took Kozu-shima obsidian far inland. This ended after the Middle Jomon period, and obsidian networks were less densely connected in the late and last Jomon periods.

The methods and conclusions are all previously published (Sakahira and Tsumura 2023). What Sakahira and Tsumura add in Social Network Analysis of Ancient Japanese Obsidian Artifacts Reflecting Sampling Bias Reduction are:

- an examination of the distribution of cosine similarities between their clusters for each period
- a similar evaluation of the cosine similarities within each cluster (and among the unclustered sites) for each period
- bootstrap analyses of the mean cosine similarities and network densities for each time period

These additional analyses demonstrate that the methods used to cluster sites are reasonable, and that the use of spatially defined clusters as nodes (rather than the individual sites within the clusters) works well as a way of reducing bias from small, unrepresentative samples. An alternative way to reduce that bias would be to simply drop small assemblages, but that would mean ignoring data that could usefully contribute to the analysis.

The cosine similarities between clusters show patterns that make sense given the results of the network analysis. The Middle Jomon period has, on average, the highest cosine similarities between clusters, and most cluster pairs have high cosine similarities, consistent with the densely connected, spatially expansive network from that time period. A few cluster pairs in the Middle Jomon have low similarities, apparently representing comparisons involving one of the few nodes on the margins of the network that had little or no obsidian from the Kozu-shima source. The other four time periods all show lower average inter-cluster similarities, and cluster pairs tend to have either high or low similarities. This probably reflects the tendency for nearby clusters to have obsidian assemblages very similar to each other and for geographically distant clusters to have dissimilar obsidian assemblages. The pattern is consistent with the less densely connected networks and regionalization shown in the network graphs. Thinking about this pattern makes me want to see a plot of the geographic distances between the clusters against the cosine similarities. There must be a very strong correlation, but it would be interesting to know whether there are any cluster pairs with similarities that deviate markedly from what would be predicted by their geographic separation.

The similarities within clusters are also interesting. For each time period, almost every cluster has a higher average (mean and median) within-cluster similarity than the similarity for unclustered sites, with only two exceptions. This is partial validation of the method used for creating the spatial clusters; sites within the clusters are at least more similar to each other than unclustered sites are, suggesting that grouping them this way was reasonable.

Although Sakahira and Tsumura say little about it, most clusters show quite a wide range of similarities between the site pairs they contain; average within-cluster similarities are relatively high, but many pairs of sites in most clusters appear to have low similarities (the individual values are not reported, but the pattern is clear in boxplots for the first four periods). There may be value in further exploring the occurrence of low site-to-site similarities within clusters. How often are they caused by small sample sizes? Clusters are retained in the analysis if they have a total of at least 30 artifacts, but clusters may contain sites with even smaller sample sizes, and small samples likely account for many of the low similarity values between sites in the same cluster. But is distance between sites in a cluster also a factor? If the most distant sites within a spatially extensive cluster are dissimilar, subdividing the cluster would likely improve the results. Further exploration of these within-cluster site-to-site similarity values might be worth doing, perhaps by plotting the similarities against the size of the smallest sample included in the comparison, as well as by plotting the cosine similarity against the distance between sites. Any low similarity values not attributable to small sample sizes or geographic distance would surely be worth investigating further.

Sakahira and Tsumura also use a bootstrap analysis to simulate, for each time period, mean cosine similarities between clusters and between site pairs without clustering. They also simulate the network density for each time period before and after clustering. These analyses show that, almost always, mean simulated cosine similarities and mean simulated network densities are higher after clustering than before. The simulated mean values also match the actual mean values better after clustering than before. This improved match to the actual values when the sites are clustered for the bootstrap reinforces the argument that clustering the sites for the network analysis was a reasonable choice.
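
A bootstrap of this kind can be sketched roughly as follows: resample each assemblage's artifacts with replacement, recompute the pairwise cosine similarities, and collect the mean over many replicates. This Python sketch reflects our reading of the general approach rather than the authors' exact procedure, and the toy assemblages are invented.

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(42)

def resample_assemblage(counts):
    """Resample one assemblage's artifacts with replacement."""
    n = int(counts.sum())
    probs = counts / counts.sum()
    return rng.multinomial(n, probs).astype(float)

def bootstrap_mean_similarity(assemblages, n_boot=1000):
    """Distribution of the mean off-diagonal cosine similarity."""
    means = []
    k = len(assemblages)
    iu = np.triu_indices(k, 1)          # indices of node pairs
    for _ in range(n_boot):
        boot = np.array([resample_assemblage(a) for a in assemblages])
        sim = cosine_similarity(boot)
        means.append(sim[iu].mean())
    return np.array(means)

# Toy obsidian assemblages (counts per source) for three network nodes.
assemblages = np.array([[105, 10, 5], [2, 40, 10], [1, 35, 12]], dtype=float)
dist = bootstrap_mean_similarity(assemblages)
print(round(dist.mean(), 3), np.round(np.percentile(dist, [2.5, 97.5]), 3))
```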

The strength of this paper is that Sakahira and Tsumura return to reevaluate their previously published work, which demonstrated strong patterns through time in the nature and extent of Jomon obsidian trade networks. In the current paper they present further analyses demonstrating that several of their methodological decisions were reasonable and their results are robust. The specific clusters formed with the DBSCAN algorithm may or may not be optimal (which would be unreasonable to expect), but the authors present analyses showing that using spatial clusters does improve their network analysis. Clustering reduces problems with small sample sizes from individual sites and simplifies the network graphs by reducing the number of nodes, which makes the results easier to interpret.

References

Sakahira, F. and Tsumura, H. (2023). Tipping Points of Ancient Japanese Jomon Trade Networks from Social Network Analyses of Obsidian Artifacts. Frontiers in Physics 10:1015870. https://doi.org/10.3389/fphy.2022.1015870

Sakahira, F. and Tsumura, H. (2024). Social Network Analysis of Ancient Japanese Obsidian Artifacts Reflecting Sampling Bias Reduction, Zenodo, 10057602, ver. 7 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7969330

Social Network Analysis of Ancient Japanese Obsidian Artifacts Reflecting Sampling Bias Reduction
Authors: Fumihiro Sakahira, Hiro’omi Tsumura
Abstract: This study aims to investigate the dynamics of obsidian trade networks during the Jomon period (approximately 15,000 to 2,400 years ago), the hunting and gathering era in Japan. To improve regional representation and reduce the distortions caus...
Thematic fields: Asia, Computational archaeology
Recommender: James Allison
Reviewers: Thegn Ladefoged, Matthew Peeples
Submitted: 2023-05-28 05:51:12
06 Aug 2023

A Focus on the Future of our Tiny Piece of the Past: Digital Archiving of a Long-term Multi-participant Regional Project

A meticulous description of archiving research data from a long-running landscape research project

Recommended by Isto Huvila based on reviews by Dominik Hagmann and Iwona Dudek

The paper “A Focus on the Future of our Tiny Piece of the Past: Digital Archiving of a Long-term Multi-participant Regional Project” (Madry et al., 2023) describes practices, challenges and opportunities encountered in the digital archiving of a landscape research project that has been running in Burgundy, France, for more than 45 years. As an unusually long-running multi-disciplinary undertaking working with a large variety of multi-modal digital and non-digital data, the Burgundy project has lived through the development of documentation and archiving technologies from the 1970s until today and has faced many of the challenges relating to data management, preservation and migration.

The major strength of the paper is that it provides a detailed description of the evolution of digital data archiving practices in the project, including considerations of why some approaches were tested and abandoned. This differs from much of the earlier literature, where it has been more common to describe individual solutions for how digital archiving was either planned or performed at one point in time. A longitudinal description of what was planned, and how and why it has worked or failed so far, as given in the paper, provides important insights into the everyday hurdles and ways forward in digital archiving. As a description of a digital archiving initiative, the paper makes a valuable contribution to data archiving scholarship as a case description of practices and considerations in one research project. For anyone working with data management in a research project, either as a researcher or as a data manager, the text provides useful advice on important practical matters to consider before, during and after the project. The main advice the authors give is to plan and act for data preservation from the beginning of the project rather than doing it afterwards. To succeed in this, it is crucial to be knowledgeable about the key concepts of data management, such as “digital data fixity, redundant backups, paradata, metadata, and appropriate keywords” as the authors underline, including their rationale and practical implications. The paper also shows that when and if unexpected issues arise, it is important to be open to different alternatives, explore ways forward, and in general be flexible.
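
As a concrete illustration of one of those concepts, data fixity, the sketch below records SHA-256 checksums for every file in an archive directory and later re-verifies them. It is a generic Python example with hypothetical directory names, not the Burgundy project's actual workflow.

```python
import hashlib
import json
from pathlib import Path

def sha256sum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large scans/rasters fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(archive_dir: str, manifest: str = "manifest.json") -> None:
    """Record a checksum for every file in the archive."""
    root = Path(archive_dir)
    checksums = {str(p.relative_to(root)): sha256sum(p)
                 for p in root.rglob("*") if p.is_file()}
    Path(manifest).write_text(json.dumps(checksums, indent=2))

def verify_manifest(archive_dir: str, manifest: str = "manifest.json") -> list:
    """Return the files whose current checksum no longer matches."""
    root = Path(archive_dir)
    recorded = json.loads(Path(manifest).read_text())
    return [rel for rel, digest in recorded.items()
            if sha256sum(root / rel) != digest]

# write_manifest("project_archive")           # hypothetical directory
# print(verify_manifest("project_archive"))   # [] means fixity holds
```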

The paper also makes a timely contribution to the discussion started at the session “Archiving information on archaeological practices and work in the digital environment: workflows, paradata and beyond” at the Computer Applications and Quantitative Methods in Archaeology (CAA) 2023 conference in Amsterdam, where it was first presented. It underlines the importance, for successful digital archiving, of understanding and communicating the premises and practices of how data was collected (and made) and used in research, and the similar pertinence of documenting digital archiving processes to make the keeping, preservation and effective reuse of digital archives possible.

References

Madry, S., Jansen, G., Murray, S., Jones, E., Willcoxon, L. and Alhashem, E. (2023) A Focus on the Future of our Tiny Piece of the Past: Digital Archiving of a Long-term Multi-participant Regional Project, Zenodo, 7967035, ver. 3 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7967035

A Focus on the Future of our Tiny Piece of the Past: Digital Archiving of a Long-term Multi-participant Regional Project
Authors: Scott Madry, Gregory Jansen, Seth Murray, Elizabeth Jones, Lia Willcoxon, Ebtihal Alhashem
Abstract: This paper will consider the practical realities that have been encountered while seeking to create a usable Digital Archiving system of a long-term and multi-participant research project. The lead author has been involved in archaeologic...
Thematic fields: Computational archaeology, Environmental archaeology, Landscape archaeology
Recommender: Isto Huvila
Submitted: 2023-05-24 18:46:34
23 Nov 2023

Percolation Package - From script sharing to package publication

Sharing Research Code in Archaeology

Recommended by James Allison based on reviews by Thomas Rose, Joe Roe and 1 anonymous reviewer

The paper “Percolation Package – From Script Sharing to Package Publication” by Sophie C. Schmidt and Simon Maddison (2023) describes the development of an R package designed to apply Percolation Analysis to archaeological spatial data. In an earlier publication, Maddison and Schmidt (2020) describe Percolation Analysis and provide case studies that demonstrate its usefulness at different spatial scales. In the current paper, the authors use their experience of collaborating to develop the R package as part of a broader argument for the importance of code sharing to the research process.

The paper begins by describing the development process of the R package: borrowing code from a geographer, refining it to fit archaeological case studies, and then collaborating to further refine and systematize the code into a package that other researchers can more easily reuse. As the review by Joe Roe noted, a strength of the paper is “presenting the development process as it actually happens rather than in an idealized form.” The authors also include a section about the lessons learned from their experience.
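
For readers new to the technique, the core idea of percolation analysis, linking points that lie within an increasing radius of one another and tracking how the largest cluster grows, can be sketched as follows. This is an illustrative Python approximation using single-linkage clustering with toy coordinates; the Schmidt and Maddison R package is the tool actually intended for such analyses and its internals may differ.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def percolation_profile(coords, radii):
    """For each radius, link all points closer than that radius and
    report the share of points in the largest resulting cluster."""
    dists = pdist(coords)                 # pairwise Euclidean distances
    tree = linkage(dists, method="single")
    profile = []
    for r in radii:
        labels = fcluster(tree, t=r, criterion="distance")
        counts = np.bincount(labels)
        profile.append((r, counts.max() / len(coords)))
    return profile

# Toy site coordinates (metres): two loose concentrations of 30 sites each.
rng = np.random.default_rng(0)
coords = np.vstack([rng.normal(0, 2000, (30, 2)),
                    rng.normal(20000, 2000, (30, 2))])
for r, share in percolation_profile(coords, radii=[1000, 5000, 10000, 25000]):
    print(f"radius {r:>6} m -> largest cluster holds {share:.0%} of sites")
```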

Moving on from the anecdotal data of their own experience, the authors also explore code-sharing practices in archaeology by briefly examining two datasets. One dataset comes from “open-archaeo” (https://open-archaeo.info/), an online list of open-source archaeological software maintained by Zack Batist. The other dataset includes articles published between 2018 and 2023 in the Journal of Computer Applications in Archaeology. Schmidt and Maddison find that these two datasets provide contrasting views of code sharing in archaeology: many of the resources in the open-archaeo list are housed on GitHub, lack persistent object identifiers, and are not easily findable (other than through the open-archaeo list). Research software attached to the published articles, on the other hand, is more easily findable, either as a supplement to the published article or in a repository with a DOI.

The examination of code sharing in archaeology through these two datasets is preliminary and incomplete, but it does show that further research into archaeologists’ code-writing and code-sharing practices could be useful. Archaeologists often create software tools to facilitate their research, but how often?  How often is research software shared with published articles? How much attention is given to documentation or making the software usable for other researchers? What are best (or good) practices for sharing code to make it findable and usable? Schmidt and Maddison’s paper provides partial answers to these questions, but a more thorough study of code sharing in archaeology would be useful. Differences among journals in how often they publish articles with shared code, or the effects of age, gender, nationality, or context of employment on attitudes toward code sharing seem like obvious factors for a future study to consider.

Shared code that is easy to find and easy to use benefits the researchers who adopt code written by others, but code authors also have much to gain by sharing. Properly shared code becomes a citable research product, and the act of code sharing can lead to productive research collaborations, as Schmidt and Maddison describe from their own experience. The strength of this paper is the attention it brings to current code-sharing practices in archaeology. I hope the paper will also help improve code sharing in archaeology by inspiring more archaeologists to share their research code so other researchers can find and use (and cite) it. 

References

Maddison, M.S. and Schmidt, S.C. (2020). Percolation Analysis – Archaeological Applications at Widely Different Spatial Scales. Journal of Computer Applications in Archaeology, 3(1), p.269–287. https://doi.org/10.5334/jcaa.54 

Schmidt, S. C., and Maddison, M. S. (2023). Percolation Package - From script sharing to package publication, Zenodo, 7966497, ver. 3 peer-reviewed and recommended by Peer Community in Archaeology. https://doi.org/10.5281/zenodo.7966497

Percolation Package - From script sharing to package publication
Authors: Sophie C Schmidt; Simon Maddison
Abstract: In this paper we trace the development of an R-package starting with the adaptation of code from a different field, via scripts shared between colleagues, to a published package that is being successfully used by researchers world-wide. Our aim...
Thematic fields: Computational archaeology
Recommender: James Allison
Submitted: 2023-05-24 15:40:15