249-ISC : Non destructive in situ strength assessment of concrete
Technical Committee 249-ISC
General Information
Chair: Prof. Denys BREYSSE
Deputy Chair: Prof. Jean-Paul BALAYSSAC
Activity starting in: 2012
Activity ending in: 2019
Context and scope
In-place strength assessment of concrete is not a new topic. A RILEM TC (TC 43) was devoted to this issue long ago. Its recommendations, which led to the development of a model with many unknown coefficients to be identified, proved impractical for on-site concrete. However, the ideas developed by TC 43 have been the roots of many subsequent works.
In-place concrete strength assessment is of major interest in various contexts: (a) for quality control of recently built structures, (b) for checking the performance of an existing structure, (c) when a structure needs retrofitting (because of a new use, an extended service life or more severe requirements). In all these cases, it is of primary importance to have a widely recognized methodology for ascertaining the mean strength and variability of the material.
The development of many non-destructive techniques, many of them enabling quick measurements at reduced cost, has opened ways for progress. A recent RILEM TC (207-INR) has shown how these techniques can help the engineer better assess the material condition, the geometry of the structure, or the development of deterioration. Its results were published recently (Feb. 2012, Non-destructive assessment of concrete structures: state-of-the-art report of the RILEM technical committee 207-INR, Coord. D. Breysse, Ed. Springer) and focused on defining the advantages and limits of NDT for solving a few specific questions. The issue of concrete strength was identified as a key problem, but the aim of that TC was not to prepare recommendations on this specific problem.
A standard (EN 13791) has even been published which makes it possible to derive strength values from NDT measurements (rebound, ultrasonic pulse velocity or pull-out), but it remains of little use because it requires a number of control specimens (cores) that is too large for most practical situations and uses a very conservative approach.
Lastly, some countries have developed the use of NDT for concrete strength assessment on a large scale. This is particularly the case in Italy, where the law makes seismic assessment of public buildings mandatory and where NDT methods, mainly the combination of rebound and ultrasonic measurements, are used by many research teams, public authorities and engineers. Other countries worldwide are developing extensive studies in the same field.
The scientific and engineering community, however, lacks practical recommendations and guidelines on how to perform the full assessment, from the choice of the techniques and areas to investigate to the data processing (choice of a model, calibration process).
On the basis of recent scientific progress, it is also reasonable to think that a specific collective effort would make it possible to quantify the material variability at the component and structural scales. Knowledge of this variability (and the confidence of the estimate) would be a decisive step for structural reliability assessment and the practical development of improved evaluations.
Terms of reference
Framework, difficulties and opportunities
Concrete strength assessment is not a new question: in RILEM history, TC 43 addressed it many years ago. Its contribution remains a reference and was a starting point for the diffusion of the SonReb methodology, which combines rebound measurements and ultrasonic pulse velocity measurements for a better assessment. The RILEM recommendation proposed a model using a series of corrective factors (for aggregate type and size, cement type…). This recommendation proved unrealistic in practice for on-site assessment of existing structures. However, its principles were adapted, and many studies have shown that combining NDT techniques to assess on-site concrete strength is worthwhile. The issue nevertheless remains controversial, since some studies show adverse results; in fact, the real advantages and limits of the combination have never been analyzed from a general point of view. Much data, from both laboratory studies and on-site measurements, has been accumulated during the last twenty years, which now makes it possible to address this question from a new and neutral point of view.
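The SonReb idea can be illustrated with a minimal sketch: a power-law conversion model fc = a·R^b·V^c, fitted in log space to a handful of core strengths. This form is one commonly used variant, not the TC's recommendation, and all numbers below are hypothetical calibration data.

```python
import numpy as np

# Hypothetical calibration data: rebound number R, ultrasonic pulse
# velocity V (m/s), and core strength fc (MPa) at the same locations.
R = np.array([32.0, 35.0, 38.0, 41.0, 44.0])
V = np.array([3900.0, 4000.0, 4100.0, 4250.0, 4350.0])
fc = np.array([24.0, 28.0, 33.0, 38.0, 44.0])

# The power law fc = a * R**b * V**c is linear in log space:
# ln(fc) = ln(a) + b*ln(R) + c*ln(V), so ordinary least squares applies.
X = np.column_stack([np.ones_like(R), np.log(R), np.log(V)])
coef, *_ = np.linalg.lstsq(X, np.log(fc), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]

def sonreb(rebound, velocity):
    """Predict strength (MPa) from rebound and pulse velocity."""
    return a * rebound**b * velocity**c

pred = sonreb(R, V)
rmse = np.sqrt(np.mean((pred - fc) ** 2))  # fit quality on the cores
```

The log-space fit is a convenience, not a requirement; direct nonlinear least squares on the power law is equally common in the literature.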
The problem is crucial since the need for NDT assessment is increasing with time, either because existing structures need to be assessed (for instance for seismic retrofitting) or because the remaining service life of ageing infrastructures (bridges, nuclear power plants…) must be estimated. In all these cases, strength assessment at various scales (local, component, structure) is required, regarding both average values and variability. The accuracy of the estimate must also be a target, since it will be used as an input in the reassessment process.
Most of the difficulty comes from the fact that, since guidelines and common recommendations are lacking, there is a wide variety of approaches, techniques and models, which may lead to apparently contradictory results. A recent European standard (EN 13791) opened the door towards a wider use of NDT, offering two ways of calibration (namely by statistical correlation or by using a basic curve and fitting), but it requires a number of cores which often appears too large in practice. At the same time, the increasing demand for efficient assessment of existing structures has driven the development of NDT use. For economic reasons, the number of cores used for calibration remains small, well below what is recommended in the standards. It thus seems useful to prepare recommendations explaining what can be obtained in terms of accuracy, and defining what strategy can be followed, at what cost.
If one tries to look further, another issue is that of material variability. This issue is crucial because it impacts the quality of the assessment, but also because variability can be considered as basic information in itself. Semi-probabilistic assessment (as in the Eurocodes) is based on the use of Partial Safety Factors which are calibrated by accounting for material variability. Being able to estimate material variability on site is a key challenge for the development of Eurocodes for existing structures (http://eurocodes.jrc.ec.europa.eu/showpage.php?id=631). Furthermore, such information is required for a full probabilistic assessment, like that recommended by the Model Code of the Joint Committee on Structural Safety (JCSS, http://www.jcss.byg.dtu.dk/upload/subsites/jcss/publications/pmc/part_iii.pdf). NDT appears to be a very good tool for estimating the variability of concrete properties, including concrete strength.
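Why variability matters can be shown with the usual characteristic-strength calculation: the 5% fractile estimated as mean minus k times the standard deviation. A minimal sketch follows; the core strengths are hypothetical and k = 1.64 is the large-sample normal value, used here purely for illustration (standards prescribe sample-size-dependent factors).

```python
import statistics

# Hypothetical core strengths (MPa) measured in one component.
cores = [31.2, 28.5, 34.0, 30.1, 27.8, 32.6]

mean = statistics.mean(cores)
s = statistics.stdev(cores)   # sample standard deviation
cov = s / mean                # coefficient of variation

# 5% fractile estimate fck = mean - k*s; k = 1.64 is the asymptotic
# normal value (illustrative only -- not a standard-compliant factor).
k = 1.64
fck = mean - k * s
```

The same mean with a larger scatter yields a lower characteristic value, which is why an on-site estimate of variability, and of the confidence in it, directly affects the assessed safety level.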
The TC will aim at validating a methodology and writing guidelines defining how the concrete strength of an existing structure can best be assessed at a given scale (local, component, storey, structure).
TC members will be recruited on the basis of their past experience in the field of non-destructive assessment of concrete. Specific attention will be paid to ensuring that all parts of the TC programme are adequately covered. Regional balance will be sought, not only for its own sake but also because emerging countries are very active in this field.
TC members' contributions may take various forms: participating in the State-of-the-Art analysis, taking part in the two round-robin tests (on real data and on synthetic data), sharing data and being involved in the building of a shared database…
The time necessary for covering the program is estimated at four years.
Detailed working programme
1. State of the Art on practices: techniques and methodologies
NDT techniques are used in many countries for concrete strength assessment. Various techniques can be used for gathering useful information, either non-destructive (NDT: rebound, ultrasonic pulse velocity, resistivity, penetration…) or slightly destructive (SD: pull-out, drilling…).
First, these methods will be reviewed, highlighting how they can contribute to strength assessment and to what influencing factors they are sensitive. The representativeness and reliability of the direct data (field measurement) and of the derived property (strength) will be estimated.
Second, the added value of combining several techniques will be analyzed: what is the efficiency, and what are the limits, of such a combination?
Third, at a larger scale, strength assessment of a construction requires a strategic approach defining how many tests are required, where they must be performed, and what the sequence of NDT tests and coring must be. Alternative approaches will be reviewed and analyzed.
2. Comparison of models for data analysis and processing
A model tells how the derived property (strength) is predicted from direct measurements. A large variety of models exists. There is agreement neither on what the best model is nor on how to evaluate the quality of a model (for what purpose? at what scale?). The TC will review models and establish a comparison grid leading to criteria that can be used to compare their merits.
Another important question is that of calibration. All experts agree that NDT or SD always requires some calibration process in order to derive strength values valid in each specific case. Calibration raises many questions (what is the reference strength: laboratory specimens or cores? How many cores are needed, and what is the weight of statistical error? ...). Various calibration processes will be compared, leading to criteria that can be used to compare their merits.
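The weight of statistical error as a function of the number of calibration cores can be explored with a simple resampling experiment. The sketch below is purely illustrative: the velocity-strength relation, the scatter, and the core counts are assumptions, not recommended values.

```python
import random
import statistics

random.seed(1)

# Hypothetical "true" velocity-strength relation plus scatter; all
# numbers are illustrative, not calibrated values.
def true_fc(v):
    return 0.012 * v - 15.0

# A synthetic population of 200 possible coring locations (V in m/s).
population = [(v, true_fc(v) + random.gauss(0.0, 2.0))
              for v in (random.uniform(3800.0, 4400.0) for _ in range(200))]

def fit_line(pairs):
    """Ordinary least-squares fit fc = a + b*V on the calibration cores."""
    n = len(pairs)
    mv = sum(v for v, _ in pairs) / n
    mf = sum(f for _, f in pairs) / n
    b = (sum((v - mv) * (f - mf) for v, f in pairs)
         / sum((v - mv) ** 2 for v, _ in pairs))
    return mf - b * mv, b

# Spread of the calibrated prediction at V = 4100 m/s versus the number
# of cores: the fewer the cores, the larger the statistical error.
spread = {}
for n_cores in (3, 9, 18):
    preds = []
    for _ in range(300):
        a, b = fit_line(random.sample(population, n_cores))
        preds.append(a + b * 4100.0)
    spread[n_cores] = statistics.stdev(preds)
```

Running the loop shows the prediction spread shrinking as cores are added, which is exactly the cost/accuracy trade-off the recommendations must quantify.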
3. Benchmark on real field data
The first two points will lead to choosing a limited number of techniques and a few alternatives for the strategic approach and calibration. The TC will elaborate a shared database of experimental data by gathering laboratory data and field data. This database will be made openly accessible to TC members, who will compare the efficiency of alternative approaches and models regarding strength assessment: how do the strength assessment results vary with the approach and methodology? What accuracy can be expected? At what cost?
4. Benchmark on synthetic field data
Synthetic data (i.e. data created by simulation) offer many advantages: once plausible physical laws have been introduced in the generation process, it is possible to simulate at low cost many variants regarding the strategic approach, the number and location of tests, the quality of tests… This is why synthetic simulations will be used by the TC in order to better analyze the efficiency of methods and approaches.
Several assessment problems may be simulated regarding:
- The type of structure: (a) a “homogeneous” building, (b) a structure with some spatial variability, like that of a bridge deck, (c) a “composite” building like a multi-storey building with different properties in beams and columns,
- The aim of assessment: local strength values, average properties, characteristic values…
After data generation, the NDT and SD tests will be simulated and processed as on a real structure. The cost and merits of the variants will be analyzed, with the objective of quantifying them and supporting future decisions.
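The whole synthetic-benchmark loop can be sketched end to end: generate a spatially variable "true" strength field, simulate noisy NDT readings from an assumed physical law, "core" a few locations for calibration, and compare the assessed mean with the known true mean. Every ingredient (the sine-shaped variability, the affine reading law, the noise levels) is a deliberately simplified assumption.

```python
import math
import random
import statistics

random.seed(42)

# Synthetic "structure": true strength varies smoothly along a beam
# (spatial variability) around a 35 MPa mean; values are illustrative.
n_points = 50
true_strength = [35.0 + 4.0 * math.sin(i / 8.0) + random.gauss(0.0, 1.0)
                 for i in range(n_points)]

# Simulated NDT reading: an affine function of the true strength plus
# Gaussian repeatability error (an assumed, simplified physical law).
readings = [0.8 * fc + 5.0 + random.gauss(0.0, 1.5) for fc in true_strength]

# "Coring" at a few random locations yields the calibration pairs.
core_ids = random.sample(range(n_points), 5)
pairs = [(readings[i], true_strength[i]) for i in core_ids]

# Calibrate a linear conversion model fc = a + b*reading by least squares.
mr = sum(r for r, _ in pairs) / len(pairs)
mf = sum(f for _, f in pairs) / len(pairs)
b = (sum((r - mr) * (f - mf) for r, f in pairs)
     / sum((r - mr) ** 2 for r, _ in pairs))
a = mf - b * mr

# Apply the calibrated model everywhere and compare mean estimates.
estimated = [a + b * r for r in readings]
err_mean = abs(statistics.mean(estimated) - statistics.mean(true_strength))
```

Because the true field is known, the error of any assessment strategy (number and placement of cores, NDT quality) can be measured directly, which is impossible with real field data.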
5. NDT inspection and knowledge updating
NDT can be performed in order to improve existing knowledge. The way new results can best be used in the updating process will be identified and formalized.
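One standard way to formalize such updating is Bayesian: a prior estimate of the mean strength (from an earlier campaign) is combined with new core results. The sketch below uses the textbook conjugate normal update with known scatter; the prior, the core values and the scatter are hypothetical, and this is one possible formalization, not the TC's prescribed procedure.

```python
# Conjugate normal update of the mean strength (known-variance case):
# prior N(mu0, s0^2) from an earlier campaign, likelihood from n new
# cores each with scatter sigma. All numbers are hypothetical.
mu0, s0 = 30.0, 4.0           # prior mean strength (MPa) and its std
new_cores = [34.1, 32.5, 35.0]
sigma = 3.0                   # assumed scatter of a single core result

n = len(new_cores)
xbar = sum(new_cores) / n

# Posterior precision is the sum of the prior and data precisions;
# the posterior mean is the precision-weighted average.
post_var = 1.0 / (1.0 / s0**2 + n / sigma**2)
post_mean = post_var * (mu0 / s0**2 + n * xbar / sigma**2)
```

The posterior mean lands between the prior and the new sample mean, pulled towards whichever source carries more precision, and the posterior variance is always smaller than both, which quantifies the gain from the additional tests.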
6. Deliverables - Draft guidelines
The main objective of the TC is to write such guidelines. The guidelines:
- Will define the possible objectives of strength assessment,
- Will review what techniques can be used, alone or in combination. For each technique, advantages and limits will be pointed out, influencing factors will be discussed and the accuracy of the estimated strength will be quantified. Time and cost elements will be addressed,
- Will explain what types of models are recommended and how their quality can be quantified,
- Will explain what strategic approaches are recommended in order to address the possible objectives at various scales (local, component, storey, structure) and will compare their merits,
- Will explain how data must be processed, addressing the calibration process. They will explain how the accuracy of the assessment can be estimated (accounting for the many sources of uncertainty: material variability, measurement error, model error, statistical uncertainty),
- Will detail some practical examples, on either real or synthetic data, explaining in detail how to proceed,
- Will provide recommendations on how knowledge updating may be carried out when additional test results become available.
In addition, the two databases (real field data and synthetic data) will be prepared for open-public access by RILEM members and a document will be written to explain how these data can be used.
The activity of this TC will contribute to RILEM work on non-destructive techniques, as was the case for the former TC 207-INR. Its field lies at the intersection of several clusters: testing, service life and deterioration. Links will be established with TC 230-PSC, chaired by Dr. Hans Beushausen, which focuses on the specification and control of concrete cover for durability purposes.
The TC activity may also contribute to other bodies:
- in relation with the work undertaken for a future Eurocode on existing buildings,
- in relation with the JCSS Model Code (the TC Chair is also a member of the JCSS committee, which is a positive point for diffusion of RILEM works).
The TC will aim at writing guidelines defining how the concrete strength of an existing structure can best be assessed at a given scale (local, component, storey, structure).
Its expected results cover:
- A State-of-the-Art of practices, guidelines and standards on this topic
- Establishing recommendations on how to carry out the investigation: with what techniques? Where? How can NDT techniques be combined? How can measurement results be processed? These recommendations must cover the issues of accuracy and cost.
- Explaining how additional information can be merged with existing knowledge by updating it.
- Giving examples of good practice and pointing out the limits
- Building a shared database for public access (this database would also be very useful for dissemination of knowledge towards developing countries, where it is often difficult to get validated experimental data).
Group of users
The TC work will benefit many users:
- End users and people in charge of building management and safety,
- Practitioners of NDT and structural assessment, who will be able to rely on common guidelines,
- Research teams, which will gain access to a shared database and will be able to develop innovative methods or strategies.
Specific use of the results
The main impact will be the existence of common guidelines making the evaluation of the strength of existing buildings possible, which is a key point for retrofitting and for estimating residual service life or safety level. The gain will be both in safety (better estimated) and in cost savings (with a better definition of the required repair works).
- Prof. Jean-Paul BALAYSSAC
- Prof. Samuele BIONDI
- Prof. Denys BREYSSE
- David CORBETT
- Dr. Arlindo GONÇALVES
- Prof. Michael GRANTHAM
- Dr. Vincenza A. M. LUPRANO
- Prof. Angelo MASI
- Dr. Zoubir Mehdi SBARTAÏ
- Dr. André VALENTE MONTEIRO