The on-coming developers are responsible for these processes!

The intent of these processes is to establish database configurations for a new expedition.

If the off-going developers want to help and time permits, that's wonderful. Needing their help, though, is a Catch-22 that probably indicates you need some rest.

Connection Notes

Log in to the operating-system oracle user first; then perform the database user logins.

  • Open a terminal session.
  • Connect to oracle@k1.ship.iodp.tamu.edu.
  • From the k1 prompt, run the commands shown.
  • Supply the appropriate credentials.

Example of the above from Windows Terminal (wt.exe), PowerShell, PuTTY, HyperTerminal, or any other such tool.

Code Block
ssh oracle@k1.ship.iodp.tamu.edu    # Connect to an ODA with a terminal session.
cd /backup/export                   # Export file and log will be placed here.
. oraenv                            # Confirm which database instance is the default
ORACLE_SID = [LIMSJR1] ?            # LIMSJR1 is the expected default. Press <Enter> for no change.
                                    # If anything else shows, contact the DBA.
                                    # The default for K2 is expected to be SHIPTEST.
sql me                              # Supply your DBA account in place of "me".
                                    # ! Quote your password if it contains spaces or symbol characters.
                                    # Example password: "@ s3cr3t fr45e"


Run data cleaning and preening scripts

Connect to Oracle as TRANSFER (schema owner) or your DBA account. Doing this work in SQL Developer is often convenient.

  • Make appropriate substitutions for current expeditions.
  • If you are new to the process, read the footnotes. Ask.


Code Block
set timing on
-- GEODESC. Remove last expedition's GEODESC project.
drop user geodp### cascade;

-- LIMS. Remove last expedition's LIMS content.
-- Review TRANSFER.LOG entries to monitor these activities.
call transfer.util_cleanout_transfer_new_tables(); -- (1)
call transfer.box_asman_catalog_delete('EXP346'); -- (2) Last expedition
call transfer.box_lime_audit_delete(###);    -- (4)
call transfer.box_lims_expedition_delete('346');  -- (3) Last expedition
call transfer.box_lims_cleanup();
call transfer.box_lims_defrag(); -- (5)

-- OPS. Remove last expedition's IRIS data.
-- The timestamps below are placeholders and will not be correct as written. Determine when the database export was conducted, then review the data over that time range before deleting.
delete from ops.ris_data where observed_time <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
delete from ops.ris_configuration where last_modified <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
delete from ops.ris_event where event_timestamp <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
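
Before running the deletes above, it can be worth confirming how many rows each statement will touch. A sketch, using the same placeholder timestamp (substitute the actual export time):

Code Block
-- Preview row counts before deleting. Substitute the actual export timestamp.
select count(*) from ops.ris_data where observed_time <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
select count(*) from ops.ris_configuration where last_modified <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');
select count(*) from ops.ris_event where event_timestamp <= to_timestamp('2023-08-11 13:00:00.0', 'yyyy-mm-dd hh24:mi:ssxff');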


The box_lims_expedition_delete() step takes the longest, so monitoring is helpful. It currently takes about 30 minutes, mostly to delete millions of results. On older hardware (fewer CPUs, slower disks) it historically ranged from 2 to 8 hours.

If material from another expedition was analyzed on this expedition, it may be necessary to delete that content too, e.g. call transfer.box_lims_expedition_delete('344(312)');

  1. VARIANCE: If you need to keep some data from the previous expedition for the next expedition, skip this step. Even if you forget, you can still copy what you need from the TRANSFER.NEW_* tables without having to resort to the Data Pump import tool.
  2. VARIANCE: If you need to keep some data from the previous expedition for reference on the next expedition, then custom SQL is required to preserve what you need. And you must communicate with the MCS to ensure what you need is preserved on the ASMAN file system--their routine is to expunge the previous expedition.
  3. VARIANCE: If multiple expeditions need to be cleaned, repeat the command for each expedition.
    Run the command again if you see errors like this:

    Code Block
    Error report -
    ORA-20000: ORU-10027: buffer overflow, limit of 1000000 bytes
    ORA-06512: at "SYS.DBMS_OUTPUT", line 32
    ...

    The above error should no longer be occurring. Use of DBMS_OUTPUT has been replaced with inserts to TRANSFER.LOG for all TRANSFER procedures and some LIMS functions and procedures.

    Alternatively, "set serveroutput off". Then use SQL like that below to monitor the progress of transfer.box_lims_expedition_delete() from another terminal window or SQL Developer worksheet. Example:

    Code Block
    select x_expedition, count(*) from lims.sample group by x_expedition order by x_expedition;


  4. Look up and specify ### (lims.x_lime_audit_main.audit_key). The script as written deletes all audit_key <= ### from lims.x_lime_audit_main. The deletion cascades to linked lims.x_lime_audit_detail.
  5. Defragmentation of the space used by SAMPLE, TEST, RESULT is recommended, but not required if you are pressed for time. Defragmentation on 398 took an hour to run with current ODAs and configs. Older ODAs and configs experienced 9-14 hours.
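
One way to find the ### cutoff for footnote (4) is to take the highest audit_key recorded before the new expedition began. This is a sketch; verify the column names against the live schema before use:

Code Block
-- Find the most recent audit_key, a candidate cutoff for box_lime_audit_delete().
select max(audit_key) from lims.x_lime_audit_main;
-- Review the most recent entries before committing to a cutoff value.
select * from (select * from lims.x_lime_audit_main order by audit_key desc) where rownum <= 20;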

2/13/2024: Removed the constraints keyword from the end of the drop user geodp### command; it was causing the script to fail.

Remove SCALES not relevant to current shipboard work

Expunge the associated depth maps, affines, and splices from LIMS.

There is presently no script for this process. The SCORS application will cancel affines and splices, but not expunge the records.

The BOX_LIMS_EXPEDITION_DELETE() procedure contains a superset of this process, but must usually be copied and customized for specific need. Review the code.

Use SQL Developer to remove content not relevant to the current expedition.
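
Since there is no canned script, removal of the scale records themselves can be sketched along these lines. The tables named below appear elsewhere on this page, but the exact set of tables to touch should be confirmed by reviewing BOX_LIMS_EXPEDITION_DELETE() before running anything:

Code Block
-- Sketch only: review against BOX_LIMS_EXPEDITION_DELETE() before running.
-- Remove affines and splices for expeditions no longer relevant.
delete from lims.scors_affine where expedition in ('###', ...);
delete from lims.scors_splice where expedition in ('###', ...);
-- Remove hole-to-scale mappings tied to those expeditions.
delete from lims.x_hole_to_scale xhts
 where exists (select 1 from lims.sample s
                where s.sample_number = xhts.hole_number
                  and s.x_expedition in ('###', ...));
commit;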


Info

When removing records from LIMS.X_SAMPLE_DEPTH, do not remove any records with a scale_id of 11311 or 11331. Some of these records are needed for 999 and QAQC samples. If you already have removed some, the script below will restore them.

Removal of X_SAMPLE_DEPTH records for specific scales and expeditions may be accomplished like this:

-- Review scales - expect consistency among these lists.
select expedition, scale_id from lims.scors_affine where expedition in ('###', ...);
select expedition, scale_id from lims.scors_splice where expedition in ('###',...);
select distinct s.x_expedition, xhts.scale_id from lims.sample s, lims.x_hole_to_scale xhts where s.sample_number=xhts.hole_number and s.x_expedition in ('###',...);

-- Selectively delete X_SAMPLE_DEPTH records by expedition and scale.
delete from lims.x_sample_depth where x_scale in (#,...) and exists (select 1 from lims.sample where sample_number=x_sample_depth.sample_number and x_expedition in ('###',...) );


Code Block
INSERT INTO lims.x_sample_depth (SAMPLE_NUMBER, SCALE_ID, TOP, BOT)
    Select s.SAMPLE_NUMBER, '11311' AS SCALE_ID, 0 AS TOP, 0 AS BOT
    from lims.sample s
        LEFT OUTER JOIN lims.x_sample_depth sd
            ON sd.sample_number = s.sample_number
                and sd.scale_id = '11311'
    where s.x_expedition = '999'
--        and s.x_site = 'U9999'
--        and s.x_hole = 'A'
--        and s.x_core = '1'
        and sd.sample_number is null;
        
INSERT INTO lims.x_sample_depth (SAMPLE_NUMBER, SCALE_ID, TOP, BOT)
    Select s.SAMPLE_NUMBER, '11331' AS SCALE_ID, 0 AS TOP, 0 AS BOT
    from lims.sample s
        LEFT OUTER JOIN lims.x_sample_depth sd
            ON sd.sample_number = s.sample_number
                and sd.scale_id = '11331'
    where s.x_expedition = '999'
--        and s.x_site = 'U9999'
--        and s.x_hole = 'A'
--        and s.x_core = '1'
        and sd.sample_number is null;

INSERT INTO lims.x_sample_depth (SAMPLE_NUMBER, SCALE_ID, TOP, BOT)
    Select s.SAMPLE_NUMBER, '11311' AS SCALE_ID, 0 AS TOP, 0 AS BOT
    from lims.sample s
        LEFT OUTER JOIN lims.x_sample_depth sd
            ON sd.sample_number = s.sample_number
                and sd.scale_id = '11311'
    where s.x_expedition = 'QAQC'
--        and s.x_site = 'U9999'
--        and s.x_hole = 'A'
--        and s.x_core = '1'
        and sd.sample_number is null;
        
INSERT INTO lims.x_sample_depth (SAMPLE_NUMBER, SCALE_ID, TOP, BOT)
    Select s.SAMPLE_NUMBER, '11331' AS SCALE_ID, 0 AS TOP, 0 AS BOT
    from lims.sample s
        LEFT OUTER JOIN lims.x_sample_depth sd
            ON sd.sample_number = s.sample_number
                and sd.scale_id = '11331'
    where s.x_expedition = 'QAQC'
--        and s.x_site = 'U9999'
--        and s.x_hole = 'A'
--        and s.x_core = '1'
        and sd.sample_number is null;


Cleaning OVERVIEW Menu and Summary Data

Participants on a new expedition have no need to see either menus or data from the previous expedition's OVERVIEW.
Menus. OVERVIEW menus are stored in LIMS.LIST_ENTRY. View the content where LIST = 'MENU'. Replace old expedition entries with entries for the new expedition. Drop menus that are not needed. The ODP and DSDP menus should always be retained.
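
To review the menu entries before changing anything, SQL along these lines can be used. The LIST = 'MENU' convention is stated above; the column holding the menu name should be verified in SQL Developer before any delete:

Code Block
-- Review OVERVIEW menu entries. Confirm which column names the menu before deleting.
select * from lims.list_entry where list = 'MENU';
-- Example removal of entries for an old expedition (keep ODP and DSDP menus):
-- delete from lims.list_entry where list = 'MENU' and <name_column> = '###';
-- commit;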
Data clean. The summary content displayed by OVERVIEW lives in LIMS.X_DBOVERVIEW. The table is indexed by expedition, site, and hole. It is sufficient to clean out specific content with a query like this example:

delete from lims.x_dboverview where expedition in ('360', '361');
commit;

Containers to be loaded for MAD analyses

See the MAD container load process, and the additional related documentation on the shore wiki. The names and masses of containers are delivered in a spreadsheet (or text file).

This is not done every expedition.