Theoretical Approaches & Context of Archaeological Computing
Wednesday, March 28
 

9:00am BST

10 - Shape grammar modelling and the visualisation of an uncertain past
Despite the ubiquity of uncertainty at every level of archaeological data collection, analysis and interpretation, most 3D visualisations continue to convey to their audiences, as well as demand from their creators, a single concrete view of the past. Where uncertainty is conveyed, it is typically in the form of visual cues that differentiate the certain from the postulated or tentative. While these may act as a qualification for the viewer, it is often at the expense of the aesthetic impact, simplicity and wholeness of the image. In this paper I will demonstrate a methodology for the visualisation of archaeological sites and buildings that incorporates the inherent uncertainty of archaeological interpretation using the procedural modelling technology of 'shape grammars'. Shape grammar modelling allows one to create transparent and adaptable parametric procedures that act as conceptual models of our interpretation of archaeological sites, using probabilistic functions to reflect interpretive uncertainty. These procedures can be used to automatically generate large numbers of diverse, stochastically determined architectural models that as a whole reflect the range of possible interpretations of a building. The strengths of this methodology will be explored using a case study from the site of Portus, Italy: a possible ship-shed dating from the 2nd century AD.
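The abstract's probabilistic shape-grammar procedures are presumably written in a dedicated procedural modelling system (e.g. a CGA-style grammar) and are not reproduced here. The following minimal Python sketch only illustrates the principle, with hypothetical rules, element names and probabilities: stochastic choices inside the generating procedure stand in for interpretive uncertainty, and the ensemble of outputs, rather than any single model, represents the interpretation.

```python
# Minimal sketch of the principle only (the paper presumably uses a dedicated
# procedural-modelling system; rule names and probabilities are hypothetical).
# A stochastic rule set generates many alternative building interpretations,
# so the ensemble as a whole reflects interpretive uncertainty.
import random

def roof_rule():
    # Interpretive uncertainty expressed as a probabilistic choice between
    # competing reconstructions of the roof form.
    return random.choices(["gabled_timber_roof", "flat_roof", "open_to_sky"],
                          weights=[0.6, 0.3, 0.1])[0]

def bay_rule():
    # The number of structural bays is uncertain within an evidence-based range.
    return random.randint(5, 8)

def shipshed_model():
    """Derive one stochastic interpretation of the building."""
    return {
        "bays": bay_rule(),
        "roof": roof_rule(),
        "rear_wall": random.random() < 0.8,   # present in 80% of interpretations
    }

# Generate an ensemble of interpretations; their variety, not any single model,
# is what gets visualised.
ensemble = [shipshed_model() for _ in range(1000)]
share_gabled = sum(m["roof"] == "gabled_timber_roof" for m in ensemble) / len(ensemble)
print(f"gabled roof in {share_gabled:.0%} of generated interpretations")
```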


Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

2 - Handling Uncertain Information on Archaeological Sites: Lessons from the 3.11 Shock in Japan
On 11 March 2011, a magnitude 9.0 earthquake and tsunami caused immense damage to residents, social infrastructure and cultural assets in the coastal areas of northeast Japan. Since the so-called '3.11 shock', we have needed to confront various serious problems: sorting cultural properties out of the debris, and conserving and preserving rescued finds, among other things. Under these conditions, the collective relocation of former residents of the afflicted areas is also under discussion. As most people who lived in the tsunami-affected areas refuse to return to the coast, proposed relocation areas tend to overlap with areas known to contain important archaeological sites. These conditions dramatically exposed the weakness of information infrastructures in Japanese archaeology. There is no integrated archaeological database that anyone can access and use freely, and this prevents analysis of the current situation. Even an accurate count of archaeological sites in the afflicted areas cannot be determined: the sites registered by each local government and by the Nara National Research Institute for Cultural Properties (NNRICP) differ, so it is difficult to grasp the scale of the damage. This problem derives from the definition of 'archaeological sites'. In Japanese archaeology, the three concepts of 'sites', 'architectural remains' and 'artifacts' are used for archaeological activities, and 'sites' are defined as places with evidence of human activities including 'architectural remains', 'artifacts', or both. However, archaeological sites are actually managed in terms of survey units rather than archaeological units. For this reason, some local governments try to reintegrate or redivide survey units into archaeological units, and others do not. At the NNRICP, archaeological sites are managed on the basis of survey reports, and their geographical locations are assigned coordinate points based on those reports. There is no way to count the true number of archaeological sites. Even so, we need to use this uncertain information to tackle current problems such as the collective relocation of sufferers. In this study, I attempt to analyse the current condition of the afflicted areas using datasets originally constructed by the NNRICP and now managed by the Consortium for Earthquake-Damaged Cultural Heritage (CEDACH). First, I verify the accuracy of the datasets; I then analyse the distribution of archaeological site complexes without reference to the three categories of archaeological material used in Japanese archaeology, visualise the archaeological sites that overlap with the tsunami damage area, and create a damage prediction map for the collective relocation of the '3.11 shock' sufferers. All of these analyses were conducted with the R statistical package and GRASS GIS, and the procedures and source code will be shown in this paper. Although the priority lies in securing a quick and safe relocation of the local population, the need to investigate the current situation of the archaeological heritage remains a critical issue. The 3.11 earthquake exposed the problems of the current digital management of the archaeological heritage and the importance of tackling the intrinsic uncertainty in existing databases; this parallel problem does not fit neatly into the immediate response to the 3.11 shock in Japan.
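The analyses described above were carried out in R and GRASS GIS, whose code the paper promises to show. As a rough, hedged illustration of the kind of overlay involved, here is a minimal sketch in Python with geopandas; the file names, column names and layer contents are hypothetical.

```python
# Rough illustration only: the paper's analyses were done in R and GRASS GIS.
# This hedged Python/geopandas sketch shows the kind of overlay described:
# which registered sites fall inside the tsunami damage area. File names and
# layer contents are hypothetical.
import geopandas as gpd

sites = gpd.read_file("cedach_sites.shp")           # site points (CEDACH/NNRICP dataset)
damage = gpd.read_file("tsunami_damage_area.shp")   # inundation polygons

# Ensure both layers share a coordinate reference system before overlaying.
damage = damage.to_crs(sites.crs)

# Sites that intersect the damage area; the remainder are candidates for a
# damage-prediction map around proposed relocation zones.
affected = gpd.sjoin(sites, damage, how="inner", predicate="intersects")
print(f"{len(affected)} of {len(sites)} registered sites fall within the damage area")
```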

Speakers
YF

Yu Fujimoto

Organization for Advanced Research and Education / Faculty of Culture and Information Science, Doshisha University


Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

3 - Two techniques for the assessment of the positional uncertainty derived from the polygon-to-point collapse operation
One of the techniques most commonly resorted to for cartographic generalization is collapse, the operation by which a geometric feature is transformed into another, lower-dimensional geometry. Issues in this respect are:

• Solving the location of the lower-dimensional geometry for each feature.
• Approximating the set of alternative spatial configurations that may be defined by the derived patterns as a consequence of the spatial extent of the higher-dimensional geometries, since alternative locations for the lower-dimensional features will result in different spatial patterns.

From the analytical viewpoint, the effect of this variability on the signal of a measure that uses the spatial structure of the lower-dimensional primitive depends on the size of the window of analysis and on the size and shape of the higher-dimensional geometries, as the interrelation between both conditions resizes the extent of the positional uncertainty with respect to the extent of the window of analysis.

Such a cartographic problem is typical in landscape studies involving area features. In our case study, disconnected scatters of lithic artefacts recorded on the terrain surface are modelled as polygon objects, and these are in turn collapsed into point objects located at the mean centres of the polygons. To assess the degree of positional uncertainty of the dataset caused by the collapse operation, under the spatial domain of the specified window of analysis W, two exploratory techniques are applied.

The first uses the length of the links that connect all possible pairs of points in a point realization. For each alternative point configuration of collapsed polygons a_i, a_j, ..., the list of interpoint distances t_xy is measured for every pair of points {x,y}. Then, for each pair {a_i, a_j}, the absolute difference between each t_xy(a_i) in a_i and its equivalent t_xy(a_j) in a_j is computed. The resultant differences are next standardized by dividing them by the square root of W (so that the standardization results from measures with the same dimensions). The frequency distribution of the ratio signals for each {a_i, a_j} can finally be summarized; the median and extreme quantiles provide robust measures of the interpoint distance discrepancy signals.

The second involves the transformation of the point pattern into a scalar model and the use of measures of association. Now, for each a_i, a_j, ..., the density surface of each a, say alpha, is estimated and structured in a grid dataset. Next, for each pair {alpha_i, alpha_j}, the bivariate association of the scalar values at colocalized grid cells is estimated. In this regard the following techniques have been applied:

• Bivariate plotting of colocalized cells. Individual plots can be accumulated in order to jointly explore the dispersion of the density patterns in the sample of alternative point realizations.
• Spearman's rank correlation coefficient, as it improves robustness against bias generated by non-Gaussian density distributions. Evaluation of significance should take account of autocorrelation processes and be associated with specific support sizes (due to the sample-size variability introduced by the modifiable blocking of the grid).
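A minimal sketch of the first technique, under the assumption that two alternative point realizations are stored as coordinate arrays in matching polygon order; the coordinates and window size below are invented for illustration, not taken from the study.

```python
# Minimal sketch (not the authors' code) of the first technique: for two
# alternative point realizations of the same collapsed polygons, compare the
# corresponding interpoint distances and standardize the discrepancies by sqrt(W).
import numpy as np
from scipy.spatial.distance import pdist

def distance_discrepancy(realization_a, realization_b, window_area):
    """Absolute differences between corresponding interpoint distances,
    standardized by the square root of the window of analysis W."""
    t_a = pdist(realization_a)   # all pairwise distances t_xy in a_i
    t_b = pdist(realization_b)   # matching pairs in a_j (same polygon order)
    return np.abs(t_a - t_b) / np.sqrt(window_area)

# Hypothetical example: two stochastic point realizations of 50 polygons,
# each point drawn somewhere inside its source polygon.
rng = np.random.default_rng(0)
centres = rng.uniform(0, 100, size=(50, 2))
real_a = centres + rng.uniform(-5, 5, size=(50, 2))
real_b = centres + rng.uniform(-5, 5, size=(50, 2))

r = distance_discrepancy(real_a, real_b, window_area=100 * 100)
print(np.quantile(r, [0.05, 0.5, 0.95]))   # robust summary: median and extreme quantiles
```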


Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

4 - Map Digitisations: Methodological Foundations, Uncertainties, and Error Margins using the example of the Gough Map
Historic maps, while not necessarily spatially accurate, function to convey to the researcher a sense of place: they capture locations as they presented themselves at the time of map production and, maybe more importantly, within the conceptual framework of the mapmaker's perception. The Gough Map, drawn around 1370 and later partly redrawn and otherwise amended, is an ideal subject for the study of digitisation techniques. Not only has it been reprinted and redrawn several times, but its features were also digitised twice, in 2005 and 2010, offering the ability to compare advances in a fast-moving field.



At the core of the paper lies a set of comparative digitisations based on the author's own photographs of the Gough Map. These make it possible to directly evaluate the accuracy of areal calculations and distance measurements using raster- and vector-based digitisations of the same subject, different preparations of the source images (including image filters to bring out features of interest), and different conceptual approaches to uncertainty (rather than drawing only the features the researcher is certain they have identified, features are marked as 'certain', 'likely' and 'possible', and the same scrutiny is applied in secondary research to their potential identification). This is particularly relevant to the Gough Map, which shows a recognisable but incomplete network of distance lines (which may or may not be a map of roads; this is disputed in the literature), as well as a number of fainter lines which are not usually rendered as 'roads' in reproductions but which might nonetheless be of cartographic significance.
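The three-level certainty scheme described above is, at heart, a data-modelling decision. The following minimal Python sketch is an assumption for illustration only (attribute names, coordinates and identifiers are hypothetical, and the actual digitisation was presumably done in a GIS package); it shows one way digitised features could carry that attribute so later analyses can filter or weight by it.

```python
# Minimal sketch (assumed, not the author's schema) of digitised map features
# carrying an explicit certainty attribute, here as GeoJSON-like dictionaries.
from enum import Enum

class Certainty(Enum):
    CERTAIN = "certain"
    LIKELY = "likely"
    POSSIBLE = "possible"

# A faint line on the map, digitised but flagged as only possibly a road.
feature = {
    "type": "Feature",
    "geometry": {"type": "LineString",
                 "coordinates": [[101.2, 44.7], [118.9, 52.3]]},  # image-space coords (hypothetical)
    "properties": {
        "interpretation": "distance line / road?",
        "certainty": Certainty.POSSIBLE.value,
        "source_image": "author_photo_012",   # hypothetical identifier
    },
}

# Downstream analyses can then filter or weight by certainty, e.g. include the
# 'possible' features when testing whether the faint lines form a network.
is_tentative = feature["properties"]["certainty"] != Certainty.CERTAIN.value
print(is_tentative)
```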



The second part of this paper thus seeks to integrate the knowledge gained through digitisation with topographic information (where the modern topographic map is adjusted to accommodate key points of the historical one and vice versa, again examining the advantages and disadvantages of each technique employed), as well as placing it in the context of medieval geographies: in other words, the known roads, rivers and important settlements which would have shaped the mapmaker's image of his world.

Speakers

Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

5 - Reliability of the representation of a distribution: a case-study on Middle Bronze Age metal finds in the Seine valley.
Workshop 3 of the ArchaeDyn II research programme studies how products circulated between sources of raw materials, places of manufacture and places of consumption. It benefits from the modelling of changing spaces and from the establishment of qualitative and quantitative indicators that make it possible to measure spatial dynamics by chronological phase. To this end it uses several pre-existing databases covering essentially the period from the Neolithic to the Bronze Age. Because these data were collected previously to address other specific archaeological questions, it is necessary to control for the state of knowledge of the studied areas and for possible biases in the distributions in order to analyse them correctly and interpret the results. The circumstances of discovery, the state of study, the preservation of remains, the accessibility of data and the choices made when the corpus was constituted vary greatly across different portions of space. This implies a high heterogeneity of data and a need to evaluate their "reliability" vis-à-vis the archaeological "reality". What factors can create under- and over-representation in the distribution or pose difficulties for the interpretation of the spatial analyses? Do these factors disturb, or affect too profoundly, our view of the "real" distribution of the studied objects? Can we finally say that the corpus is sufficiently reliable for a spatial study of the circulation and consumption of products? Here we present a case study of the so-called Normand type palstave axes in the lower and middle Seine valley. This type was produced in series from the 15th to the 14th century BC. The axes found in the lower and middle Seine valley show different levels of finishing, carried out after the axe was removed from the mould. The majority of the approximately 400 axes currently inventoried in the spatially referenced database have been studied macroscopically. This dataset was used in the analyses of ArchaeDyn's Workshop 3 to examine whether or not the level of finishing depends on the distance from the production centres. This corpus is particularly interesting for developing a method for estimating the reliability of distributions because of its particularities: the metal finds are known essentially from hoards and isolated finds. Most came from fortuitous discoveries, so the corpus and the distribution of the data are above all influenced by finds made in the course of agricultural activities, construction, development work and the exploitation of natural resources (dredging…). By comparing the data with the possibility of discovery (a reliability map), we propose to examine the impact of the "conditions of discovery" factor on the archaeological data and to see whether, in the present state of research, it is possible to control for it.
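As a hedged illustration of what a comparison between find distribution and discovery opportunity might look like, the sketch below builds a toy reliability map in Python; the land-use classes, weights and find counts are entirely hypothetical and are not drawn from the ArchaeDyn databases.

```python
# Minimal sketch (an assumption, not the ArchaeDyn workflow) of a reliability map:
# each grid cell gets a discovery-opportunity score from the land-use factors the
# abstract mentions (agriculture, construction, dredging...), and observed find
# counts are then read against that score rather than at face value.
import numpy as np

# Hypothetical weights: how much each land-use class favours fortuitous discovery.
DISCOVERY_WEIGHT = {"arable": 0.8, "urban": 0.6, "dredged_river": 0.9,
                    "forest": 0.2, "pasture": 0.3}

def reliability_map(landuse_grid):
    """Map a 2D grid of land-use labels to discovery-opportunity scores in [0, 1]."""
    return np.vectorize(DISCOVERY_WEIGHT.get)(landuse_grid).astype(float)

# Toy 3x3 study area and axe counts per cell (entirely illustrative).
landuse = np.array([["arable", "arable", "forest"],
                    ["urban", "pasture", "forest"],
                    ["dredged_river", "arable", "pasture"]])
finds = np.array([[6, 4, 0],
                  [3, 1, 0],
                  [8, 5, 1]])

reliability = reliability_map(landuse)
# Cells with few finds but low discovery opportunity are "unknown" rather than "empty".
possibly_underrepresented = (finds <= 1) & (reliability < 0.4)
print(reliability)
print(possibly_underrepresented)
```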

Speakers
EG

Estelle Gauthier

University of Franche-Comté, MSHE C.-N. Ledoux, Besançon, France


Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

6 - Temporal Uncertainty and Artefact Chronologies
This presentation will address an important feature of archaeological analysis: how we manage the uncertainty associated with our traditional relative dating of archaeological artefacts. It emphasises the need to develop more explicitly probabilistic methods for assigning artefacts to particular chronological periods and, as a case study, draws upon some 14,000 potsherds from an intensive surface survey of the Greek island of Antikythera that have been treated with these issues in mind from the outset. I consider several statistical methods that can be useful for understanding how the uncertainty associated with one period of time may be shared with another, and how these shared uncertainties propagate into our spatial analysis and/or subsequent interpretation. Mapping such uncertainty in a variety of ways not only allows us to develop more sophisticated interpretations, given the present state of our knowledge, but also allows us to identify where further fieldwork or object re-study can most profitably be invested. Finally, many of these challenges have implications for field practice, and in particular for the type of excavation or survey strategies we might wish to promote, and I will also reflect on these wider methodological debates.
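A minimal sketch of the kind of probabilistic period assignment the abstract argues for (not the speaker's actual method; period names and probabilities are hypothetical): each sherd carries a probability vector over periods, per-period totals become expected counts, and a co-assignment matrix shows how much uncertainty two periods share.

```python
# Minimal sketch (assumed, not the speaker's method) of probabilistic period
# assignment: each sherd carries a probability vector over chronological periods,
# and per-period totals are expected counts rather than hard tallies.
import numpy as np

periods = ["Final Neolithic", "Early Bronze", "Middle Bronze", "Late Bronze"]

# Hypothetical probabilities for three sherds; a diagnostic sherd concentrates
# its mass on one period, while an ambiguous one spreads it over several.
sherd_probs = np.array([
    [0.05, 0.85, 0.10, 0.00],   # fairly diagnostic
    [0.25, 0.35, 0.25, 0.15],   # ambiguous: uncertainty shared across periods
    [0.00, 0.10, 0.60, 0.30],
])

expected_counts = sherd_probs.sum(axis=0)   # expected number of sherds per period
for name, count in zip(periods, expected_counts):
    print(f"{name}: {count:.2f}")

# How much uncertainty two periods "share": co-assignment summed over sherds.
shared = sherd_probs.T @ sherd_probs        # periods x periods co-occurrence matrix
print(np.round(shared, 2))
```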

Speakers

Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

7 - Exploring probable histories: applying Monte-Carlo methods for uncertainties in spatial and temporal analysis.
One of the biggest burdens carried by archaeologists is the ubiquity of uncertainty in most aspects of the discipline. Chronometry, sampling biases and the indirect inference of past human behaviour are perhaps the most relevant examples. While some theoretical discussion of the implications of these problems has taken place in the past, practical solutions have been relatively few. Uncertainty is in fact often measured and acknowledged, but a direct integration of its implications into the analytical domain has been rare. The treatment of uncertainty can generally be distinguished into two main stages. The first involves its quantification, often through the adoption of a probabilistic description of the empirical data. The second stage is characterised by the formal and quantitative integration of such knowledge into the analytical workflow, which ideally translates into outputs where quantified patterns can be classified according to degrees of uncertainty. Such an outcome could then be an integral part of the archaeological narrative, providing a framework for distinguishing what we know for sure from what we are less certain about. This paper will tackle the second aspect, focusing on case studies of spatial and temporal analysis. It will explore how probabilistic data obtained from a wide variety of methods (e.g. radiocarbon dating, aoristic analysis, probabilistic categorisation, etc.) can be integrated into analytical workflows that are traditionally not designed to deal with such types of input. The proposed solution is to fully exploit the available information by using Monte Carlo simulation methods. The results of such a technique offer a distribution of possible alternative histories that can be inferred from the current state of knowledge. A statistical summary of such output then provides the likelihood of occurrence for each alternative event. Applications of the proposed method will be shown for three different case studies: 1) inferences on population dynamics based on pithouse counts; 2) spatio-temporal variation in settlement density; and 3) analysis of the settlement rank-size distribution.
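A minimal sketch of the Monte Carlo logic for the first case study (pithouse counts), under invented date ranges and time blocks; it illustrates the general approach described above, not the paper's code.

```python
# Minimal sketch (an assumption, not the paper's code) of the Monte Carlo approach:
# each site has an uncertain date, we draw one "possible history" per simulation,
# and the distribution of outcomes across simulations summarises our uncertainty.
import numpy as np

rng = np.random.default_rng(42)
n_sims = 1000
time_blocks = np.arange(-3000, -1000, 100)      # hypothetical 100-year blocks (BC)

# Hypothetical date ranges (uniform, aoristic-style) for ten pithouses.
date_ranges = [(-2900, -2400), (-2700, -2100), (-2500, -1900), (-2300, -1800),
               (-2200, -1600), (-2600, -2000), (-2100, -1500), (-2000, -1400),
               (-2800, -2300), (-1900, -1300)]

counts = np.zeros((n_sims, len(time_blocks)))
for s in range(n_sims):
    # One possible history: a single date drawn for every pithouse.
    dates = np.array([rng.uniform(lo, hi) for lo, hi in date_ranges])
    for j, start in enumerate(time_blocks):
        counts[s, j] = np.sum((dates >= start) & (dates < start + 100))

# Likelihood envelope of pithouse counts (a proxy for population) per block.
low, median, high = np.percentile(counts, [5, 50, 95], axis=0)
for start, lo_c, med, hi_c in zip(time_blocks, low, median, high):
    print(f"{start} to {start + 100}: {lo_c:.0f}-{hi_c:.0f} (median {med:.0f})")
```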

Speakers
EC

Enrico Crema

Institute of Archaeology, UCL


Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

8 - Quantifying the Obvious: communicating uncertainty in the geochemical provenance of archaeological ceramics
Recently, archaeological materials analysis has focused on the development and application of new techniques, such as handheld X-ray fluorescence (XRF) analysers, for the geochemical provenance of artefacts (e.g. Goren et al. 2011). While these studies are important for the continued growth of the discipline and stimulate valuable discussion about quantitative, semi-quantitative, quasi-quantitative and qualitative data sets (e.g. Shackley 2010), they ignore the underlying uncertainties inherent in the data themselves. Measuring the elemental composition of ceramic artefacts with the highest degree of accuracy and precision, to the lowest detection limits possible, using either quantitative or semi-quantitative instruments, is only valuable for establishing artefact provenance if that chemical profile actually provides information about provenance. The premise of ceramic provenance using geochemical analyses is that raw material signatures are chemically distinct and that those chemical signatures can be detected in ceramic fabrics. Sediments are generally not directly suitable for potting and must first be processed and refined, changing their mineralogical and chemical signature (Rice 1987: 118-119). What is measured in chemical provenance studies, then, is the chemical profile of a ceramic fabric, which may or may not be related chemically to the raw sediment from which it is composed (e.g. Hein et al. 2004). The degree to which ceramic fabrics reflect the provenance of their raw materials is a level of uncertainty which is often overlooked when communicating the results of these studies. An additional level of uncertainty relates to the chemical homogeneity of the ceramic raw materials themselves. Geochemical variability in the natural world is limited: a finite number of elements bond in predictable ways to crystallise a finite number of minerals, and those minerals combine to form a predictable and finite number of rock types. Like crystallisation, the weathering of rocks follows an established trajectory, so that the chemical and mineralogical heterogeneity of detrital sediments is limited further still. Sediments from different geographic locations or provenances can have the same chemical signature (e.g. Klein and Langmuir 1989). This geochemical homogeneity is another level of uncertainty which is often ignored in ceramic provenance studies. These layers of uncertainty, and the failure to communicate them effectively in geochemical provenance studies, impact the larger archaeological narrative through the misidentification of ceramic provenance, upon which social, economic and political theories and relative chronologies are based. This paper evaluates uncertainty in chemical provenance studies of archaeological ceramics related to human behaviour and natural geological homogeneity, and proposes new vocabulary for communicating this uncertainty within the wider archaeological community.
Goren, Y., Mommsen, H., Klinger, J., 2011. Journal of Archaeological Science 38, 684-696. Hein, A., Day, P.M., Quinn, P.S., Kilikoglou, V., 2004. Archaeometry 46, 357-384. Klein, E.M., Langmuir, C.H., 1989. Journal of Geophysical Research 94, 4241-4252. Rice, P.M., 1987. Pottery Analysis, University of Chicago Press, Chicago. Shackley, M.S., 2010. The SAA Archaeological Record November, 17-20.

Speakers
AH

Alice Hunt

UCL Institute of Archaeology; Research Collaborator, Department of Anthropology, Smithsonian Institution NMNH. Academia: http://ucl.academia.edu/AliceHunt


Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)

9:00am BST

9 - Embracing Uncertainty and The London Charter: Case Studies with the 3D Restoration of Ancient Sculpture
The London Charter is an international initiative to define best practice in the computer visualization of cultural heritage (http://www.londoncharter.org/). Among the principles embodied in the Charter is the need to publish the "paradata" along with the digital visualization of an archaeological artifact or monument. Paradata is defined as "information about human processes of understanding and interpretation of data objects" (http://www.londoncharter.org/glossary.html). In the case of 3D digital models that restore or reconstruct lost or damaged artifacts, the paradata often focuses on elements that are uncertain and how the uncertainty was defined and resolved. In this paper, I will present two case studies drawn from recent work by the Virtual World Heritage Laboratory: the 3D data capture, modeling, and restoration of the portraits of the Greek philosopher Epicurus and the Roman emperor Caligula (the latter supported by NEH grant # RZ-51221). In both projects, imperfectly preserved ancient originals had to be reconstructed on the basis of scattered evidence pertaining to the size and position of the head; the position and gesture of the arms; and the color used on the surface. In the case of Caligula, three variants were created, all equally probable, reflecting the input of an international scholarly team of archaeologists and conservators. In the case of Epicurus, twelve variants were generated reflecting the understanding of three specialists on Greek portraiture.

Speakers

Wednesday March 28, 2012 9:00am - 1:15pm BST
Building 65, Room 1143 (streamed into Room 1167)
 

