Research papers, University of Canterbury Library

Background: This study examines the performance of nonlinear total-stress 1D wave-propagation site response analysis for modelling site effects in physics-based ground motion simulations of the 2010-2011 Canterbury, New Zealand earthquake sequence. This approach allows for explicit modelling of 3D ground motion phenomena at the regional scale, as well as detailed nonlinear site effects at the local scale. The approach is compared to a more commonly used empirical method for computing site amplification based on VS30 (the 30 m time-averaged shear-wave velocity), as proposed by Graves and Pitarka (2010, 2015), and to empirical ground motion prediction via a ground motion model (GMM).
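The VS30 parameter referenced above has a simple closed form: 30 m divided by the shear-wave travel time through the top 30 m of the soil profile, i.e. VS30 = 30 / Σ(hᵢ/vᵢ). A minimal sketch with a hypothetical three-layer profile (the layer data below is illustrative only, not from the study):

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear-wave velocity over the top 30 m.

    VS30 = 30 / sum(h_i / v_i), with the deepest layer truncated
    so the thicknesses total exactly 30 m.
    """
    remaining = 30.0
    travel_time = 0.0
    for h, v in zip(thicknesses_m, velocities_mps):
        if remaining <= 0.0:
            break
        h = min(h, remaining)       # clip the layer at the 30 m depth limit
        travel_time += h / v
        remaining -= h
    if remaining > 0.0:
        raise ValueError("profile is shallower than 30 m")
    return 30.0 / travel_time

# Hypothetical profile: 10 m at 200 m/s, 10 m at 300 m/s, 20 m at 500 m/s
print(round(vs30([10, 10, 20], [200, 300, 500]), 1))  # → 290.3
```

Because VS30 is a harmonic (travel-time) average, slow near-surface layers dominate the result, which is why it serves as a first-order proxy for site amplification in the empirical method.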

Research papers, University of Canterbury Library

New Zealand has a long tradition of using light timber frame for construction of its domestic dwellings. After the most recent earthquakes (e.g. the Canterbury earthquake sequence), wooden residential houses showed satisfactory life-safety performance. However, poor performance was reported in terms of their seismic resilience. Although numerous innovative methods to mitigate damage have been introduced to the New Zealand community in order to improve wooden house performance, these retrofit options have not been readily taken up. The low number of retrofitted wooden-framed houses raises the question of whether homeowners are aware of the necessity of seismically retrofitting their houses to achieve satisfactory seismic performance. This study aims to explore different retrofit technologies that can be applied to wooden-framed houses in Wellington, taking into account the need for homeowners to understand the risk, likelihood and extent of damage expected after an event. A survey of Wellington homeowners will be conducted about their perceptions of the expected performance of their wooden-framed houses. The survey questions were designed to gain an understanding of homeowners' perceived levels of safety and their awareness of possible damage after a seismic event. Afterwards, a structural review of a sample of the houses will be undertaken to identify common features and detail potential seismic concerns. The findings will break down barriers to improving the performance of wooden-framed houses and enhance homeowners' confidence in the event of future seismic activity. This will result in increased understanding and contribute towards an accessible knowledge base, which may significantly increase the uptake of these technologies and avoid unnecessary economic and social costs after a seismic event.

Research papers, University of Canterbury Library

Semi-empirical models based on in-situ geotechnical tests have become the standard of practice for predicting soil liquefaction. Since the inception of the “simplified” cyclic-stress model in 1971, variants based on various in-situ tests have been developed, including the Cone Penetration Test (CPT). More recently, prediction models based solely on remotely-sensed data were developed. Similar to systems that provide automated content on earthquake impacts, these “geospatial” models aim to predict liquefaction for rapid response and loss estimation using readily available data. These data include (i) common ground-motion intensity measures (e.g., PGA), which can either be provided in near-real-time following an earthquake or predicted for a future event; and (ii) geospatial parameters derived from digital elevation models, which are used to infer characteristics of the subsurface relevant to liquefaction. However, the predictive capabilities of geospatial and geotechnical models have not been directly compared; such a comparison could elucidate techniques for improving the geospatial models and would provide a baseline for measuring improvements. Accordingly, this study assesses the relative efficacy of liquefaction models based on geospatial vs. CPT data using 9,908 case studies from the 2010-2016 Canterbury earthquakes. While the top-performing models are CPT-based, the geospatial models perform relatively well given their simplicity and low cost. Although further research is needed (e.g., to improve upon the performance of current models), the findings of this study suggest that geospatial models have the potential to provide valuable first-order predictions of liquefaction occurrence and consequence. Towards this end, performance assessments of geospatial vs. geotechnical models are ongoing for more than 20 additional global earthquakes.
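For readers unfamiliar with how such binary liquefaction predictors are compared, a common scoring metric is the area under the ROC curve (AUC), which can be computed directly from the Mann-Whitney statistic. The sketch below uses made-up labels and model scores purely for illustration; it is not the study's data or its models:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case receives a higher score than a
    randomly chosen negative case (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy case histories: 1 = liquefaction observed, 0 = not observed.
# Scores stand in for each model's predicted liquefaction probability.
labels = [1, 1, 1, 0, 0, 0, 1, 0]
cpt_scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.2, 0.7, 0.5]  # hypothetical CPT model
geo_scores = [0.7, 0.9, 0.4, 0.5, 0.2, 0.3, 0.6, 0.8]  # hypothetical geospatial model

print(roc_auc(labels, cpt_scores), roc_auc(labels, geo_scores))  # → 1.0 0.75
```

An AUC of 0.5 corresponds to random guessing and 1.0 to perfect discrimination, so this kind of comparison quantifies the performance gap between the geotechnical and geospatial approaches on a common footing.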

Research papers, University of Canterbury Library

Disasters are rare events with major consequences; yet comparatively little is known about managing employee needs in disaster situations. Based on case studies of four organisations following the devastating earthquakes of 2010-2011 in Christchurch, New Zealand, this paper presents a framework that redefines notions of employee needs and expectations and charts the ways in which these influence organisational recovery and performance. Analysis of in-depth interview data from 47 respondents in four organisations highlighted the evolving nature of employee needs and the crucial role of middle management leadership in mitigating the effects of disasters. The findings have counterintuitive implications for human resource functions in a disaster, suggesting that organisational justice forms a central framework for managing organisational responses to support and engage employees for promoting business recovery.

Research papers, University of Canterbury Library

The 2010-2011 Canterbury earthquake sequence, and the resulting extensive data sets on damaged buildings that have been collected, provide a unique opportunity to exercise and evaluate previously published seismic performance assessment procedures. This poster provides an overview of the authors’ methodology to perform evaluations with two such assessment procedures, namely the P-58 guidelines and the REDi Rating System. P-58, produced by the Federal Emergency Management Agency (FEMA) in the United States, aims to facilitate risk assessment and decision-making by quantifying earthquake ground shaking, structural demands, component damage and resulting consequences in a logical framework. The REDi framework, developed by the engineering firm ARUP, aids stakeholders in implementing resilience-based earthquake design. Preliminary results from the evaluations are presented. These have the potential to provide insights on the ability of the assessment procedures to predict impacts using “real-world” data. However, further work remains to critically analyse these results and to broaden the scope of buildings studied and of impacts predicted.
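The P-58 chain mentioned above (ground shaking → structural demand → component damage → consequence) can be illustrated, in a highly simplified form, as a small Monte Carlo loss calculation. Everything below (the drift demands, lognormal fragility parameters and repair costs) is hypothetical and far coarser than an actual P-58 assessment:

```python
import math
import random

def p_damage(demand, median, beta):
    """Lognormal fragility curve: probability that a component reaches
    its damage state at the given demand, with median capacity `median`
    and logarithmic dispersion `beta`."""
    z = math.log(demand / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_loss(demands, components, n_trials=10000, seed=1):
    """Tiny Monte Carlo in the spirit of the P-58 logic: sample a demand
    realization, simulate component damage, and sum repair costs."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        d = rng.choice(demands)                    # one demand realization
        for median, beta, cost in components:
            if rng.random() < p_damage(d, median, beta):
                total += cost                      # component is damaged
    return total / n_trials

# Hypothetical drift demands and two components: (median capacity, beta, repair cost)
demands = [0.005, 0.010, 0.015]
components = [(0.008, 0.4, 20000.0), (0.020, 0.5, 50000.0)]
print(round(expected_loss(demands, components), -2))
```

A real P-58 evaluation propagates many more sources of uncertainty (hazard, response simulation, damage-state correlation, repair-time consequences), but the logic-tree structure is the same.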

Research papers, University of Canterbury Library

Natural catastrophes are increasing worldwide: they are becoming more frequent, but also more severe and impactful on the built environment, leading to extensive damage and losses. Earthquakes account for the smallest share of natural catastrophe events; nevertheless, seismic damage led to the most fatalities and significant losses over the period 1981-2016 (Munich Re). Damage prediction is helpful for emergency management and the development of earthquake risk mitigation projects. Recent design efforts have focused on the application of performance-based earthquake engineering, where damage estimation methodologies use fragility and vulnerability functions. However, this approach does not explicitly identify the essential criteria leading to economic losses. There is thus a need for an improved methodology that finds the critical building elements related to significant losses. The methodology presented here uses data science techniques to identify the key building features that contribute to the bulk of losses. It uses empirical data collected on site during earthquake reconnaissance missions to train a machine learning model that can then be used to estimate building damage post-earthquake. The first model is developed for Christchurch: empirical building damage data from the 2010-2011 earthquake events is analysed to find the building features that contributed most to damage. Once processed, the data is used to train a machine-learning model that can be applied to estimate losses in future earthquake events.
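As a loose illustration of how damage-related building features can be screened from empirical data, the sketch below scores each binary feature by how well it alone separates damaged from undamaged buildings. The feature names and toy inventory are invented for illustration and are not the Christchurch dataset; a real model would use richer features and a proper learning algorithm:

```python
def feature_split_accuracy(records, labels, feature):
    """Score a single binary feature by the best classification accuracy
    it achieves on its own -- a crude proxy for feature importance."""
    best = 0.0
    for predict_when_true in (0, 1):   # try both orientations of the rule
        correct = sum(
            (predict_when_true if r[feature] else 1 - predict_when_true) == y
            for r, y in zip(records, labels)
        )
        best = max(best, correct / len(labels))
    return best

# Toy building inventory (hypothetical features); damaged: 1 = damaged, 0 = not
buildings = [
    {"unreinforced_masonry": 1, "soft_storey": 0, "pre_1970": 1},
    {"unreinforced_masonry": 1, "soft_storey": 1, "pre_1970": 1},
    {"unreinforced_masonry": 0, "soft_storey": 0, "pre_1970": 0},
    {"unreinforced_masonry": 0, "soft_storey": 1, "pre_1970": 0},
    {"unreinforced_masonry": 1, "soft_storey": 0, "pre_1970": 0},
    {"unreinforced_masonry": 0, "soft_storey": 0, "pre_1970": 1},
]
damaged = [1, 1, 0, 0, 1, 0]

for f in ("unreinforced_masonry", "soft_storey", "pre_1970"):
    print(f, feature_split_accuracy(buildings, damaged, f))
```

In practice such single-feature screening is only a starting point; tree ensembles or regularised regression would capture feature interactions that this one-at-a-time score misses.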

Research papers, University of Canterbury Library

Validation is an essential step in assessing the applicability of simulated ground motions for use in engineering practice, and a comprehensive analysis should include both simple intensity measures (e.g., PGA and SA) and the seismic response of a range of complex systems obtained by response history analysis. In order to enable a spectrum of complex structural systems to be considered in systematic validation of ground motion simulations in a routine fashion, an automated workflow was developed. This workflow enables validation of simulated ground motions in terms of different complex model responses by considering various ground motion sets and different ground motion simulation methods. It converts the complex validation process into a routine one by providing a platform to perform validation promptly as a built-in step of simulation post-processing. As a case study, validation of simulated ground motions was investigated via the automated workflow by comparing the dynamic responses of three steel special moment resisting frames (SMRFs) subjected to 40 observed and 40 simulated ground motions from the 22 February 2011 Christchurch earthquake. The seismic responses of the structures are principally quantified via peak floor acceleration and maximum inter-storey drift ratio. Overall, the results indicate a general agreement between seismic demands obtained using the recorded and simulated ensembles of ground motions and provide further evidence that simulated ground motions can be used in code-based structural performance assessments in place of, or in combination with, ensembles of recorded ground motions.
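The two response quantities used above are straightforward to extract from response histories: peak floor acceleration is the maximum absolute acceleration per floor, and the maximum inter-storey drift ratio is the largest relative displacement between adjacent floors divided by the storey height. A minimal sketch with toy two-storey histories (the values and the uniform 3 m storey height are hypothetical):

```python
def peak_floor_acceleration(accel_histories):
    """Maximum absolute acceleration over time, per floor.
    accel_histories[i] is the acceleration time series of floor i."""
    return [max(abs(a) for a in floor) for floor in accel_histories]

def max_interstorey_drift_ratio(disp_histories, storey_height):
    """Largest absolute storey drift ratio over all storeys and time
    steps; disp_histories[i] is the lateral displacement history of
    floor i (floor 0 = first floor above ground, assumed fixed base)."""
    worst = 0.0
    for i in range(len(disp_histories)):
        below = disp_histories[i - 1] if i > 0 else [0.0] * len(disp_histories[i])
        for u_top, u_bot in zip(disp_histories[i], below):
            worst = max(worst, abs(u_top - u_bot) / storey_height)
    return worst

# Toy 2-storey response histories (m/s^2 and m), 3 m storey height
acc = [[0.1, -0.4, 0.2], [0.3, -0.6, 0.1]]
disp = [[0.00, 0.02, -0.01], [0.00, 0.05, -0.02]]

print(peak_floor_acceleration(acc))            # → [0.4, 0.6]
print(max_interstorey_drift_ratio(disp, 3.0))
```

In a validation exercise these scalars would be computed for every structure under each recorded and simulated motion, and the resulting demand distributions compared across the two ensembles.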

Research Papers, Lincoln University

The increase in urban population has required cities to rethink their strategies for minimising greenhouse gas impacts and adapting to climate change. While urban design and planning policy have been guided by principles such as walkability (to reduce the dependence on cars) and green infrastructure (to enhance the quality of open spaces to support conservation and human values), there have been conflicting views on what spatial strategies will best prepare cities for a challenging future. Researchers supporting compact cities based upon public Transit Oriented Development have claimed that walkability, higher density and mixed uses make cities more sustainable (Owen, 2009) and that, while green spaces in cities are necessary, they are dull in comparison with shopfronts and street vendors (Speck, 2012, p. 250). Other researchers claim that green infrastructure is fundamental to improving urban sustainability and attracting public space users with improved urban comfort, consequently encouraging walkability (Pitman and Ely, 2013). Landscape architects tend to assume that ‘the greener the better’; however, the efficiency of urban greenery in relation to urban comfort and urbanity depends on its density, distribution and the services provided. Green infrastructure can take many forms (from urban forests to street trees) and provide varied services (amended microclimate, aesthetics, ecology and so forth). In this paper, we evaluate the relevance of current policy in Christchurch regarding both best practice in green infrastructure and urban comfort (Tavares, 2015). We focus on the Christchurch Blueprint for rebuilding the central city, and critically examine the post-earthquake paths the city is following regarding its green and grey infrastructures and the resulting urban environment. We discuss the performance and appropriateness of the current Blueprint in post-earthquake Christchurch, particularly as it relates to the challenges that climate change is creating for cities worldwide.