The intent of these processes is to establish database configurations for a new expedition.
If the off-going developers want to help and time permits--that's wonderful. It's a Catch-22 that probably indicates you need some rest.
Log into the operating system oracle user first; then conduct database user logins:

oracle@k1.ship.iodp.tamu.edu

From Windows Terminal (wt.exe), PowerShell, PuTTY, HyperTerminal, or any other such tool, run the commands shown below.
ssh oracle@k1.ship.iodp.tamu.edu   # Connect to an ODA with a terminal session.
cd /backup/export                  # Export file and log will be placed here.
. oraenv                           # Confirm which database instance is the default.
ORACLE_SID = [LIMSJR1] ?           # LIMSJR1 is the expected default. Press <Enter> for no change.
                                   # If anything else shows, contact the DBA.
                                   # The default for K2 is expected to be SHIPTEST.
sql me                             # Supply your DBA account in place of "me".
                                   # ! Quote your password if it contains spaces or symbol characters.
                                   # Example. Password: "@ s3cr3t fr45e"
Run data cleaning and preening scripts
Connect to Oracle as TRANSFER (schema owner) or your DBA account. Doing this work in SQL Developer is often convenient.
set timing on

-- GEODESC. Remove last expedition's GEODESC project.
drop user geodp### cascade constraints;

-- LIMS. Remove last expedition's LIMS content.
-- Review TRANSFER.LOG entries to review these activities.
call transfer.util_cleanout_transfer_new_tables();   -- (1)
call transfer.box_asman_catalog_delete('EXP346');    -- (2) Last expedition
call transfer.box_lime_audit_delete(###);            -- (4)
call transfer.box_lims_expedition_delete('346');     -- (3) Last expedition
call transfer.box_lims_cleanup();
call transfer.box_lims_defrag();                     -- (5)

-- OPS. Remove last expedition's IRIS data.
-- The timestamps below will surely not be correct.
-- Review when the database export was conducted; delete the data based on that time range.
delete from ops.ris_data where observed_time <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
delete from ops.ris_configuration where last_modified <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
delete from ops.ris_event where event_timestamp <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
The box_lims_expedition_delete() step is the longest; monitoring is helpful. It currently takes about 30 minutes--mostly to delete millions of results. Historically it ranged from 2 to 8 hours (older hardware, fewer CPUs, slower disks).
If material from another expedition was analyzed on this expedition, it may be necessary to delete that content too, e.g. call transfer.box_lims_expedition_delete('344(312)');
VARIANCE: If multiple expeditions need to be cleaned, repeat the command for each expedition.
Run the command again if you see errors like this:

Error report -
ORA-20000: ORU-10027: buffer overflow, limit of 1000000 bytes
ORA-06512: at "SYS.DBMS_OUTPUT", line 32
...
Alternatively, issue "set serveroutput off". Then use SQL like that below, from another terminal window or SQL Developer worksheet, to monitor the progress of transfer.box_lims_expedition_delete(). Example:
select x_expedition, count(*)
  from lims.sample
 group by x_expedition
 order by x_expedition;
The parameter to transfer.box_lime_audit_delete() is an audit key (lims.x_lime_audit_main.audit_key). The script as written deletes all audit_key values <= ### from lims.x_lime_audit_main. The deletion cascades to the linked lims.x_lime_audit_detail records.
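The audit cleanup described above can be illustrated in plain SQL. A minimal sketch, assuming a hypothetical cutoff key of 12345 in place of ### and that audit_key is the column linking the two tables; in practice, use the TRANSFER procedure rather than manual deletes:

-- Illustrative only; transfer.box_lime_audit_delete(###) is the supported path.
-- 12345 is a hypothetical cutoff audit_key.
delete from lims.x_lime_audit_detail
 where audit_key <= 12345;   -- cascades automatically when using the procedure
delete from lims.x_lime_audit_main
 where audit_key <= 12345;
commit;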
Expunge the associated depth maps, affines, and splices from LIMS.
There is presently no script for this process. The SCORS application will cancel affines and splices, but does not expunge the records. Use SQL Developer to remove content not relevant to the current expedition.
Participants on a new expedition have no need to see menus or data from the previous expedition's database in OVERVIEW.
Menus. OVERVIEW menus are stored in LIMS.LIST_ENTRY. View the content where LIST = 'MENU'. Change old expeditions to new expeditions and drop menus that are not needed. The ODP and DSDP menus should always be retained.
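A quick way to review the OVERVIEW menu entries before changing or dropping anything (the LIST = 'MENU' filter comes from the step above; inspect the returned columns before writing any update or delete):

-- Review current OVERVIEW menu definitions before editing.
select *
  from lims.list_entry
 where list = 'MENU';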
Data clean. The summary content displayed by OVERVIEW lives in LIMS.X_DBOVERVIEW. The table is indexed by expedition, site, and hole, so it is sufficient to clean out specific content with a targeted query.
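A minimal sketch of such a cleanout, assuming the outgoing expedition number is 346 and that the index column is named EXPEDITION (both hypothetical; verify the actual column names and values in SQL Developer first):

-- Hypothetical column name and expedition value; verify before running.
delete from lims.x_dboverview
 where expedition = '346';
commit;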
See the MAD container load process and additional related documentation on the shore wiki. The names and masses of containers are delivered in a spreadsheet (or text file). This is not done every expedition.