SEG Exhibition & Annual Meeting 2010 in Denver
I will present a talk in session ACQ 2 -- Survey Design and Marine OBS, Wednesday, October 20, 2010; Room: 103/105.
Investigation and optimization of OBC sensor array coupling to the seafloor
The most important challenges for the oil industry are to increase the recovery rates of existing fields, to map fluid movements with time-lapse (4D) seismic, to reduce geohazards such as compromised cap-rock integrity and subsidence, and to monitor CO2 storage in offshore reservoirs using active and passive sources (e.g. airgun surveys and passive/microseismic monitoring). The seismic equipment is configured as sensor lines, with the cables trenched into and covered by the sea bottom.
The equipment typically comprises up to 4000 sensor nodes, depending on the areal extent of the reservoir. The system will preferably be connected directly to onshore operation centers by means of broadband communication, so the data can be checked and processed for QA purposes in real time. Traditional 4D seismic techniques re-shoot a specific reservoir with conventional towed streamers, trying to map and assess changes in the reservoir at a repeat interval of 2-3 years.
In September 2010 I started a new research project in cooperation with Octio Geophysical AS and the University of Bergen.
The project has two main objectives. The first is a systematic investigation of OBC coupling to the seafloor, aiming at a better understanding of the horizontal coupling mechanism in order to increase the signal quality for 4D seismic, micro/passive seismic, and the monitoring of fluid flow in a reservoir during injection of water or supercritical CO2.
The second objective is to design a new OBC sensor housing (node) with improved coupling and to test it on a real reservoir, either in cooperation with a customer or in an additional research project.
In order to improve sensor coupling in the offshore environment, novel and systematic approaches are needed. Today several approaches exist for the vertical component of a three-component sensor, but there are no systematic investigations for the horizontal components. Most of the vertical approaches use special pre-processing methods like separation of upgoing and downgoing P and S wave-fields, or inverse filtering.
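As background, the classic dual-sensor (PZ) summation behind such up/down separation can be sketched in a few lines. This is a generic illustration with assumed water parameters and an assumed polarity convention, not the specific pre-processing used in the project:

```python
def pz_separation(p, vz, rho=1000.0, c=1500.0):
    """Classic dual-sensor (PZ) summation for up/down wavefield separation.

    p:   hydrophone trace (pressure), vz: co-located vertical geophone
         trace (particle velocity), as equal-length NumPy arrays.
    rho, c: water density (kg/m^3) and sound speed (m/s); scaling vz by
    the acoustic impedance rho*c puts both traces in comparable units.
    NOTE: the sign depends on the vz polarity convention; flip the sign
    of vz if your convention differs.
    """
    z = rho * c * vz        # impedance-scaled vertical velocity
    up = 0.5 * (p - z)      # upgoing wavefield
    down = 0.5 * (p + z)    # downgoing wavefield
    return up, down
```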
To understand the sensor coupling to the seafloor, however, it is essential to investigate the coupling itself. For a systematic investigation it is crucial to understand how different sensor-housing designs couple to variable seafloors. Which design improves the sensor coupling, and why? What influence do the seafloor soil conditions have, how deep should an OBC be trenched, and how strong is the influence of cable tension? Why does the sensor coupling differ between the three components?
All these questions will be investigated using simulations and tests in a water tank. The water tank will be mechanically decoupled from the floor to isolate it from ambient noise.
The most critical challenge will be to simulate the correct soil coupling with the sensor housing for horizontal movements, because current models do not fit the measurements and can only be used as starting models. Finding the optimal mechanical node design is also a critical challenge, because the new design has to be integrated into the offshore installation procedures.
EAGE Annual Conference & Exhibition 2010 in Barcelona
From 14 to 17 June I attended the Annual Meeting in Barcelona.
Below I summarize the conference presentations and the workshop. The workshop focused on reservoir simulation and was organized as a round table with discussions, work groups and short presentations of the results. In the technical sessions my foci were CSEM, microseismic and seismic attenuation.
I will start the summary with the workshop “Conditioning Reservoir Models to Dynamic Data”, which took place on 14.06.2010.
The goal of this workshop was to improve reservoir simulation, with more predictable results and more precise geo-modeling. We discussed several methods to combine all available data and feedback mechanisms, such as: defining forecast objectives with a degree of interaction for a better production profile, optimal treatment of the data volume, methods to make key history parameters available, better estimation techniques to combine all data into a geo-model, and the development of more advanced back-propagation methods.
One session in the workshop was a discussion about automatic history matching and its limits. The goal was to discuss possibilities to reduce the risk of misguidance or wrong forecasts and to save money and time. Reducing the risk is more important than optimizing or increasing oil recovery.
Geo-modeling and its uncertainties was also an interesting session, with the objective of generating dynamic data to simulate a reservoir more accurately. The classical streamlined workflow runs from data processing, analysis and interpretation to a geo-model, and the reservoir simulation leads to the final evaluation of an improved production profile. The interesting point is that many feedback loops exist between the geo-model, the reservoir simulation and all types of data.
The interaction between feedback loops can be misleading, and clearly identifying all of them is difficult at best.
Each feedback loop and its interactions increase the uncertainty of the geo-model; this may be addressed by conditioning the reservoir model. In the traditional way of conditioning the reservoir model, dynamic data are used with “history matching” as guidance.
The main problems with conditioning the reservoir can be summarized as follows: there is no unique solution, and the simulation relies not only on the history match but also on the forecast.
The last session I describe here was the discussion “What future improvements of 4D seismic techniques must be implemented?”
Nearly all attendees (96%) agreed on the following important improvements: permanent reservoir monitoring systems should be installed as soon as oil production starts, which is important to obtain historical data as early as possible, and the 4D seismic processing turnaround time should be less than three months with highly accurate data.
4D seismic often provides information about the progress of an injected fluid front, the growth of aquifers, the expansion of gas caps and the distribution of reservoir compaction. The main problem with 4D seismic is to provide highly accurate information with high vertical resolution. The vertical resolution, roughly a quarter of the dominant wavelength (e.g. about 25 m at 3000 m/s and 30 Hz), is typically at the scale of tens of meters, which causes significant uncertainties in the knowledge of reservoir connectivity and heterogeneity.
BP has run a feasibility survey on Valhall with very good results. They used a permanent reservoir monitoring system and they think that this system represents the future in reservoir modeling and will help to improve the production profile.
I will start with the CSEM sessions. Besides exploration, the main interest is now the integration of CSEM, seismic and well data. Some presentations addressed the possibility of time-lapse CSEM and how to process the data, and presented promising results. Only simulations and some university tests have been conducted so far, and further investigation is needed.
In the coffee breaks I discussed with other attendees (from BP, Schlumberger, Shell, Uni Delft and Scripps…) the possibility of installing permanent EM systems on the seafloor, and we all agreed that a permanent installation offers the same advantages as on the seismic side. One open question remains: how can we calculate the receiver spacing more accurately to illuminate the reservoir depletion?
Time-domain and frequency-domain EM (TEM and FEM) are under lively discussion (e.g. air waves overlay the signals in the frequency domain, but not in the time domain). Which is the better method to process the data more precisely and with less processing time? Whether to use inline-only or inline and crossline EM is also still up for discussion.
A presentation from Statoil pointed out that CSEM results are seen as imperfect value-of-information by the asset teams and that the uncertainties are risky at best.
On the CSEM hardware side some new concepts were presented. Multi-source and multi-receiver setups will be used in the future to illuminate the reservoir in 3D. One presentation from the Scripps institute discussed a feasibility study.
The passive and microseismic session provided nothing new since the last EAGE. They are still arguing that the H/V ratio could be an indicator for hydrocarbons, but no proof has been presented yet. The mechanism behind the 2-6 Hz spike in the frequency spectrum above a reservoir remains uninvestigated, and only weak indications were presented (more or less the same as in the last two years, and they are still questionable). In order to locate microseismic events better, the velocity model has to be more precise. Some presentations discussed the possibility of using a combination of vertical and horizontal measurements and shots to increase the accuracy of the velocity model.
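For readers unfamiliar with the H/V ratio: it is simply the ratio of the horizontal to the vertical amplitude spectrum of a three-component record. A minimal sketch, assuming the three components are available as NumPy arrays (function and parameter names are illustrative):

```python
import numpy as np
from scipy.signal import welch

def hv_ratio(north, east, vertical, fs, nperseg=1024):
    """Horizontal-to-vertical spectral ratio of a 3C record.

    north, east, vertical: equal-length 1D arrays; fs: sampling rate (Hz).
    Returns the frequency axis and the H/V ratio; the debated HC indicator
    would show up as a spectral peak in the 2-6 Hz band.
    """
    f, pn = welch(north, fs=fs, nperseg=nperseg)   # power spectra
    _, pe = welch(east, fs=fs, nperseg=nperseg)
    _, pv = welch(vertical, fs=fs, nperseg=nperseg)
    h = np.sqrt((pn + pe) / 2.0)   # mean horizontal amplitude spectrum
    v = np.sqrt(pv)                # vertical amplitude spectrum
    return f, h / v
```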
The seismic attenuation sessions covered a topic that was new to me (and still is), so I cannot say much about it. The problem with seismic attenuation is that it increases with depth and frequency. The attenuation can also cause phase/time shifts in the data, which makes picking P- and S-waves more difficult.
One standard technique to resolve this is inverse Q-filtering, which tries to compensate for the attenuation. The problem is to obtain a proper estimate of Q. One way to estimate Q is to calculate the spectral ratio, for example from the inline and crossline components of a sensor.
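In its generic textbook form the spectral-ratio method compares the spectra of the same event recorded at two traveltimes: the log spectral ratio ln(A2/A1) is linear in frequency with slope -pi*dt/Q, so Q follows from a least-squares fit. A minimal sketch of that generic formulation (not necessarily the exact component-ratio variant mentioned above; all names are illustrative):

```python
import numpy as np

def estimate_q(trace1, trace2, dt_travel, fs, fmin=5.0, fmax=60.0):
    """Estimate Q with the spectral-ratio method.

    trace1, trace2: windowed wavelets of the same event recorded at two
    traveltimes separated by dt_travel (s); fs: sampling rate (Hz).
    ln(A2/A1) = const - (pi * f * dt_travel) / Q, so Q comes from the
    slope of a linear fit over a usable frequency band.
    """
    n = max(len(trace1), len(trace2))
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    a1 = np.abs(np.fft.rfft(trace1, n))
    a2 = np.abs(np.fft.rfft(trace2, n))
    band = (f >= fmin) & (f <= fmax) & (a1 > 0)
    ratio = np.log(a2[band] / a1[band])
    slope, _ = np.polyfit(f[band], ratio, 1)   # linear fit over the band
    return -np.pi * dt_travel / slope
```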
EAGE CO2 Storage workshop Berlin 2010
On 11 and 12 March 2010 I attended the CO2 storage workshop in Berlin, Germany. The main focus of the workshop was to present results from simulations and case studies of existing demo projects.
My focus was to investigate the current status of reservoir monitoring, which type of monitoring is preferred, and how big the interest is in installing a reservoir monitoring system above a CO2 reservoir.
The workshop was organized as a “round table” to share experiences and to discuss further opportunities. Some invited speakers presented feasibility studies and cases, which served as the basis for discussion.
The European CO2 storage projects are mostly coordinated by the ZEP task force (Zero Emission Platform; www.zeroemissionsplatform.eu ) and the EU. The agenda to start commercial CO2 storage facilities is planned in three steps. The first step was the feasibility study, which is mostly finished. The second phase is the setup of at most seven demo facilities to investigate possible risks and to deliver the proof of concept; this will be finished by 2015. Several onshore and offshore tests have started or will start soon in Europe, with varying results so far, but mainly with acceptable risks. The third phase involves commercial prototypes and shall be finished by 2020.
One topic at the round table was to evaluate the biggest problems for CO2 storage. After the discussion we summarized the problems in three topics, in order of importance:
- Public problems: The acceptance of installing or using CO2 storage onshore or offshore, both in politics and in the community, is uncertain at best. For local politicians the decision to inject CO2 in their electoral district is a “hot potato”, and they avoid making a decision within their term. The public attitude can be summarized as “not in my back yard”, and many local groups and media try to stoke fear about the CO2 injection, even where a gas-injection reservoir already exists.
- Financial problems: To run a CO2 reservoir cost-efficiently, it is important to bring the injection costs down to limits acceptable for the industry. To be of interest to the industry, the costs should be less than the CCS license. This is nearly impossible (see the summary below).
- Technical problems: Most of the technical problems discussed at the workshop are related to the uncertainties in and around a reservoir. Even where reservoir information from oil/gas fields exists, no information is available about the cap-rock itself; the oil/gas industry is only interested in the reservoir, not in the cap-rock.
In addition, the following three unknowns exist:
- Reservoir capacity to store CO2: unknown, because the behavior differs from that of an oil/gas reservoir.
- Injectivity: the CO2 injection rate and maximum pressure that do not destroy the reservoir or the installations.
- Containment: long-term storage of at least 1000 years, at best without monitoring. No one knows what happens to society over such a long period (the back-to-the-stone-age scenario).
Finally I can summarize:
Leaving aside the discussion of whether CO2 storage is really necessary: the CO2 storage project is burdened with a lot of uncertainties. It looks like all problems can be solved, but not on the EU time scale. Explaining the relatively complex solutions to politicians and the public will be difficult at best, but it is extremely important in order to take the CO2 storage project further.
For now it is quite unclear whether CO2 storage will reach industrial acceptance. The oil and gas industry, power plant owners and heavy-industry companies are only interested if the price of CO2 storage is less than the CCS license. A realistic CO2 storage price will be between 100€ and 200€ per ton, while the CCS license will reach at most 50€. It is therefore most unlikely that the industry will follow this project. Right now there is also no marketing value in CO2 storage, so the interest of the industry might be extremely low.
My personal opinion is that climate change is not necessarily caused by CO2; no proven fact supporting this theory exists. It looks as if the climate is warming, but this does not mean that the warming is caused by CO2 or that it is man-made. I could name a long list of arguments against it, but that would not help to find the correct answer either. First of all we need hard facts, without opinions and beliefs. We are far from understanding the earth's behavior, and to get a better understanding we need clear investigations and answers without pressure from politics and social communities.
Simulations and models can only be the beginning of research and a guide to the correct questions, but they should not be used for decisions we have to live with for a long time. I think the risk of leaving the CO2 where it is is much lower than that of storing it in a reservoir!
SEG 2009 Houston Annual Meeting
From 24 to 31 October I attended the Annual Meeting in Houston.
At this convention my main foci were CSEM and passive/microseismic. I attended several sessions and one lecture course on CSEM.
In the passive/microseismic sessions two case studies were presented that use permanent reservoir monitoring systems (PRMS) to measure micro-cracks in the offshore environment, which supports my idea of using a PRMS not only for active seismic surveys. The main problem is related to the soft sediment layer and the high noise floor on the sea bottom. The micro-fractures in the reservoir, induced by pore-pressure changes, have very low magnitudes (ca. Mw = -2). But with several hundred nodes in a PRMS and the possibility to stack the data, the micro-cracks can be located!
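Why stacking helps: summing N traces aligned on the candidate traveltimes grows a coherent event by N, while incoherent noise grows only by roughly sqrt(N). A minimal delay-and-sum sketch, assuming a constant-velocity model and NumPy arrays for traces and geometry (all names are illustrative):

```python
import numpy as np

def stack_power(traces, fs, node_xyz, grid_xyz, v):
    """Delay-and-sum location of a weak microseismic event.

    traces:   (n_nodes, n_samples) array of recorded data
    node_xyz: (n_nodes, 3) receiver positions in m
    grid_xyz: (n_points, 3) candidate source positions in m
    v:        constant P velocity in m/s (a crude stand-in for a real model)
    Returns the stacked energy per candidate point; the maximum marks the
    most likely source location.
    """
    n_nodes, n_samp = traces.shape
    power = np.zeros(len(grid_xyz))
    for i, src in enumerate(grid_xyz):
        delays = np.linalg.norm(node_xyz - src, axis=1) / v     # traveltimes (s)
        shifts = np.round((delays - delays.min()) * fs).astype(int)
        stack = np.zeros(n_samp)
        for tr, s in zip(traces, shifts):
            stack[:n_samp - s] += tr[s:]                        # align and sum
        power[i] = np.max(stack ** 2)                           # coherent energy
    return power
```

A real implementation would use a 3D velocity model and sub-sample interpolation of the shifts, but the sqrt(N) noise suppression is the reason several hundred nodes can pull Mw = -2 events out of the noise.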
After the downturn, the CSEM technique is coming back to the market, and not only for exploration. I have seen growing interest in permanent offshore systems to monitor a reservoir in combination with seismic. Besides some technical issues, the main problem in using CSEM for monitoring is the data processing: 3D data processing is a time-consuming operation regardless of the available computing power. The processing time has to be reduced to make time-lapse surveys meaningful.
GeoPMS included in the Softpedia Mac OS software database
and new release: Version 0.14
"GeoPMS, has been added to Softpedia's database of software programs for Mac OS. It is featured with a description text, screenshots, download links and technical details on this page: http://mac.softpedia.com/get/Math-Scientific/GeoPMS.shtml
GeoPMS has been tested in the Softpedia labs using several industry-leading security solutions and found to be completely clean of adware/spyware components. We are impressed with the quality of your product and encourage you to keep these high standards in the future.
To assure our visitors that GeoPMS is clean, we have granted it the "100% FREE" Softpedia award. Softpedia guarantees that GeoPMS 0.13 is 100% Free, which means it does not contain any form of malware, including but not limited to: spyware, viruses, trojans and backdoors. This software product was tested thoroughly and was found absolutely clean; therefore, it can be installed with no concern by any computer user. More information about your product's certification and the award is available on this page:
The new version 0.14 includes the following features:
* Store documents in the database (3 times the database binary-field size)
* Automated database update to new versions
* Error reporting in a separate XML document
I have also fixed some GUI errors and made some usability changes.
Important change: from version 0.14 on, only “store into the database” is usable on Windows. I will no longer support local storage on Windows machines.
EAGE Annual Conference & Exhibition 2009 in Amsterdam
From 6 to 11 June I will be at the Annual Meeting in Amsterdam. You can visit me at the OCTIO booth or in the oral sessions. I am looking forward to meeting you at the conference.
I will present a talk at the Octio booth with the title "Sensor for Continuous Reservoir Monitoring - selection criteria" on 09.06.2009 / 16:00 - 16:30 and 11.06.2009 / 10:00 - 10:30.
Sensor+Test Conference & Exhibition 2009 in Nürnberg
Session Fiber-optic sensors
The first session was about fiber-optic sensors (FOS), and the focus was more on the source side (e.g. laser diodes) than on the fiber-optic sensor itself. One company presented an FOS system for Airbus, and in the discussion I asked them about lifetime and reliability. Fortunately my question started a very interesting discussion, which continued at the social evening. We were a group of five people discussing lifetime and reliability, and my conclusion is:
Airbus has run long-life tests (started in 1998) with FBGs and electronic strain meters in parallel on a test wing under very hard conditions. The failure rate of the FOS was remarkably lower than that of the electronic strain meters. The lifetime and reliability should be very high, in the range of 25 years. The FBGs themselves are not the problem, but I think the design of the accelerometers and hydrophones could reduce lifetime and reliability and produce noisy data.
Unfortunately the talk about an adaptive optoelectronic feedback system was canceled. But it appears that the FOS feedback system is a piezoelectric mirror which keeps the difference in optical path length constant. The mirror is driven by the amplified voltage of the detector to close the necessary feedback loop.
Session Capacitive Sensors / Micro Sensors
In this session the main focus was on improving capacitive sensors. Most of the presentations described solutions for the typical problems (which are not well solved yet!): environmental noise, parasitic capacitances / electrostatic discharge (e.g. a hand moving above the sensor) and leakage resistance. The presented interfaces are able to solve one or more of these problems, and the resolution reaches values down to attofarads (aF)!
Some talks presented capacitive-MEMS hybrid solutions to measure relative and absolute pressure over a dynamic range from Pa to GPa (25 mV/MPa). They have very low power consumption (less than 0.01 mW), high temperature stability, better SNR than “classical” pressure devices and lower sensitivity to parasitic capacitance changes. Most of these products have reached pre-production state. The reliability and lifetime are comparable with CMOS technology and should meet long-lifetime requirements.
Session Magnetic Sensors
The last session of interest focused on magnetic sensors, namely Hall sensors, GMR sensors, flux-gate and SQUID sensors. Three of these sensor types are quite interesting because they can measure the earth's magnetic field: Hall, GMR and flux-gate.
Some talks presented 3D sensors and/or rotation and torque sensors and solutions to improve sensor quality. The main problems today are the compensation of magnetization harmonics, the cancellation of external magnetic fields (e.g. cross-talk) and offset errors. For most of these problems solutions are now available, and some sensors will reach pre-production state this year.
EAGE Passive Seismic Workshop
The oil companies are interested in using reservoir monitoring systems for micro/passive seismic, but at the moment the cost of installing a dedicated permanent micro/passive seismic system is too high, according to a coffee-break talk with Shell. If they can get this information in addition to active seismic from the same reservoir monitoring system, however, that looks like an additional argument for the assessment teams and decision-makers to install a permanent seismic monitoring array.
Most of the oil companies at the workshop believe in the usability of offshore measurements, but solid data from large arrays still do not exist. Only simulations support the usability of surface arrays (CGG, Shell and Chevron).
In addition to micro/passive seismic, tests have started with permanent sources on the ocean floor, and the results look quite promising. The permanent sources use piezoelectric devices, essentially an inverted hydrophone, but I think the frequency is too high and the energy too low. The oil companies would like a permanent reservoir monitoring system with micro/passive sensors and permanent active seismic sources on the ocean floor; shooting vessels would then only be needed for calibration shots, not for reservoir monitoring.
SEG Exhibition & Annual Meeting 2008 in Las Vegas
Besides my work at the OCTIO booth I visited various technical sessions. My foci were noise reduction, microseismicity and ocean-bottom instrumentation. Noise reduction is usually done with software algorithms, and some very interesting methods were presented at the SEG: eigenimage filtering, empirical mode decomposition (EMD) and polarization filtering, for example.
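Of these, the eigenimage filter is the easiest to sketch: it is an SVD of the trace-by-sample data matrix in which only the strongest eigenimages are kept, since laterally coherent events concentrate in the largest singular values while random noise spreads over all of them. A minimal sketch (the rank k is data-dependent; all names are illustrative):

```python
import numpy as np

def eigenimage_filter(gather, k=3):
    """Eigenimage (SVD) noise filter for a seismic gather.

    gather: (n_traces, n_samples) array; keeping only the first k
    eigenimages preserves laterally coherent energy and suppresses
    incoherent noise.
    """
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    s_filtered = np.zeros_like(s)
    s_filtered[:k] = s[:k]            # keep the k strongest eigenimages
    return (u * s_filtered) @ vt      # reconstruct the filtered gather
```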
Most of the technical oral sessions and workshops about ocean-bottom instrumentation dealt with sensor quality and with increasing the source energy, frequency band and reflected amplitude: multi-depth marine source arrays, correction for sea-surface wave heights and wide-azimuth seismic, for instance.
Using microseismic for hydraulic fracturing is a new technology and has been applied to well measurements for a couple of years. At the SEG a discussion also started about using microseismicity for permanent reservoir monitoring on the ocean bottom. The problem with the microseismic source is that the micro-fractures induced by pore-pressure changes have very low amplitudes (ca. M = -2). But I think we can measure microseismicity with highly accurate seismic accelerometers (see publication: MEMS-Accelerometer calibration) and good coupling to the ocean floor.
Two new Geores software projects on SourceForge.net
GeoPMS is a simple paper management system to archive, search and manage scientific papers. It offers full abstract search as well as search by author and by title. It uses a MySQL DB.
GDS is a project to check generic measured earth-science data. It is also an interactive database management system that stores all data in a MySQL DB. GDS works as a project-based management system with an unlimited number of stations; each station manages 20 sensors. GDS uses simpleGDS, an XML document, to import or export database projects.
In 2005 Rainer Karlsch published "Hitlers Bombe. Die geheime Geschichte der deutschen Kernwaffenversuche" ("Hitler's Bomb: The Secret History of the German Nuclear Weapons Tests"). The book triggered a heated controversy. Were nuclear devices, consisting of large amounts of conventional explosive and only small quantities of fission and fusion material, tested in Germany in 1944/1945, and how can this be proven?
To further clarify these and other questions, nuclear weapons specialists, physicists, historians and journalists have their say here for the first time. They weigh whether nuclear tests could have taken place in the German Reich, analyze the available sources, and examine the scientific and historical background. The contributions range from the Sänger plan for an attack on New York, through research on minimizing the critical mass and an attempt to calculate the TNT equivalent of the German nuclear tests, to portraits of important actors of the time.
With contributions by Reinhard Brandt, Wolfgang Ebsen, Gernot Eilers, Alexander Funtikov, Paul-J. Hahn, Rainer Karlsch, Marcus Landschulze, Vladimir Mineev, Günter Nagel, Heiko Petermann, Pawel Rodziewicz, Bernd Schulze.
2007, Cottbuser Studien zur Geschichte von Technik, Arbeit und Umwelt, Vol. 29, 352 pages, paperback, 29.90 EUR, ISBN 978-3-8309-1893-6
There are many indications that, due to piezoelectric effects, thermal processes and pore water in the rock, the earth's electromagnetic field changes days before an earthquake or volcanic eruption. This would be a measurable quantity that offers a window for an early-warning system. The EMASS project has the task of systematically investigating the electromagnetic field. After construction and commissioning, the first step is the measurement and identification of noise signals through a long-term measurement. The antenna will then be extended so that the localization of quakes becomes possible.
Under the menu item "Bildergalerie" (image gallery) there is a series of pictures with descriptions of the instruments and sensors used.