
The outgoing programmer is responsible for all of the end-of-expedition (EOX) activities.

The goal is to ensure a good copy of the data gets back to HQ.

Please read this entire document and its companion before starting. Expect the exercise to take 3 to 4 hours. (The oft-quoted "15 minutes" covers only the individual database dump processes.) Do not hurry. If you are fried, work together.

0. Readiness assessment.


Confirm with science party and technical staff before proceeding.

  • Are we done collecting data for this laboratory area?
  • Has all the data content that should go home been uploaded to the database, copied to data1, or copied to the EXPEDITION PROCEEDINGS volume?
  • Where is the data? A data disposition form is available in s:\data1. Filling it out is helpful to the data librarian.

If not, obtain estimates of when data collection will be done. Schedule accordingly.

1. Notify everyone. Database snapshot in progress.

  • Notify jr_developer and the shipboard technical staff that the EOX database backup is in progress.
  • The database is available for reports, but content cannot be changed while this activity is in progress.

2. Make the database read only.


Connect to the BUILD box. Execute all of the following steps from that system.


Run PuTTY.
Connect to oracle@k1.ship.iodp.tamu.edu
From the prompt, run these commands; supply credentials when prompted (the oracle user for k1, then your DBA account for sqlplus):

. oraenv
ORACLE_SID = [LIMSJR] ? LIMSJR
sqlplus your-name_dba
sql>
alter tablespace labware read only;


At the SQL> prompt, run the ALTER TABLESPACE command.
Stop. Leave this window open for use below.

Technical note. This command prevents write access to any table stored in the "labware" tablespace. In our database implementation, that happens to be all tables owned by the LIMS schema. Use of Drill Report (the "ops" schema) is not affected. Aspects of the DescLogik software configuration also remain open to maintenance, though use of DescLogik itself is locked out by this change.

3. Full snapshot as oracle@k1.


Run another PuTTY session.
Connect to oracle@k1.ship.iodp.tamu.edu
From the prompt, run these commands, substituting your expedition number and your name. This requires logging in as a DBA user (see PWSAFE). If the password contains special characters, put it in "quotes".

NOTE: While performing the EOX for Expedition 384, it was discovered that the path represented by "data_pump_dir" did not exist. You can discover what data_pump_dir is supposed to be using this query (logged in with your DBA account):

select * from dba_directories;

The results contain a lot that is not useful, but DATA_PUMP_DIR should be there. Currently it is /backup/LIMSJR/dpdump. Note that "backup" is singular (it used to be plural), and there is apparently some desire to change this, so keep that in mind. If the folder does not exist, the export command in this step will fail, and you may need an MCS or a DBA on shore to help create it.

You can also try:

create directory data_pump_dir as '/backup/LIMSJR/dpdump';
grant read, write on directory data_pump_dir to transfer, system;

10/5/2021: The folder did not exist, as above, but the MCS suggested creating it myself (in PuTTY, logged in as oracle) using mkdir. That initially seemed to work, but it broke a symbolic or virtual link, causing the file to be written to the wrong physical location (with reduced disk space); not recommended.

. oraenv
ORACLE_SID = [LIMSJR] ? LIMSJR
# Note that the folder you're cd-ing to here should be the "data_pump_dir" referenced in the next command.  This comes from the dba_directories table (see note above).
cd /backup/LIMSJR/dpdump
time expdp YOUR_DBA_ACCOUNT directory=data_pump_dir full=y logfile=limsjr_full-YOUR_EXP_NUMBER-export.log dumpfile=limsjr_full-YOUR_EXP_NUMBER-YOUR_INITIALS.dmpdp
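Once the export finishes, it is worth scanning the Data Pump log for ORA- errors before moving on. A minimal sketch; the log file here is a stand-in created for illustration, not your real export log:

```shell
# Sketch only: scan a Data Pump log for ORA- errors.
# demo-export.log is a placeholder standing in for the real log file.
printf 'Export: Release\nJob "SYS"."SYS_EXPORT_FULL_01" successfully completed\n' > demo-export.log
if grep -q 'ORA-' demo-export.log; then
  echo "errors found - review the log before continuing"
else
  echo "log clean"
fi
```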


Statistics. Typical durations for the last step from previous expeditions. 344S: 16 min 23 sec; 345: 13 min 54 sec; 346: 18 min 46 sec; 341S: 11 min 42 sec; 351: 12 min 9 sec; 360: 16 min 57 sec; 363: 3 min 21 sec; 367: 3 min 38 sec; 396: 6 min 54 sec.

ODA for 3 expeditions of data (11.1 GiB): 4 min 17 sec.

4. Clear out transfer schema content

From the command prompt [oracle@k1 ~]$
Run sqlplus and log in as the TRANSFER user.

sqlplus transfer

Run these commands at the SQL> prompt. Use your expedition.

spool eox_YOUR_EXP_NUMBER.log
set serveroutput on
set timing on
set pagesize 0
set head on
alter session set nls_date_format='yyyy-mm-dd hh24:mi:ss';
call cleanout_new_tables();


Leave this session open for use below.
Statistics: 344S: < 1 sec. 345: < 1 sec.
396: <1 sec

5. Snapshot the subset of database content to go home (using the transfer login).


Run these scripts from the SQL> prompt. Use your expedition instead of "EXP351"; the others are the same for every expedition.
Copy ASMAN data for the current expedition to the transfer schema.
Copy DESC data to the transfer schema.

call asman_catalog_to_new('EXP351');
call asman_catalog_to_new('DROPS');
call asman_catalog_to_new('DESCINFO');
call descinfo_to_new();

Copy LIMS data to the transfer schema. NOTE: Generally, you only need to do this for your expedition and "QAQC". Use entries like the middle ones below only when material from a previous expedition is studied on the current expedition.

call lims_expedition_to_new('351');
call lims_expedition_to_new('195');
call lims_expedition_to_new('125');
call lims_expedition_to_new('60');
call lims_expedition_to_new('QAQC');

Leave this session open. It is used again in the next step.

Statistics. 344S: ~20 min. 345: 10 min for all steps.
396: ~22 min.

6. Count the records backed up. Row-counts are expected to be identical between source tables and transfer tables. 

At the SQL> prompt, run these commands

NOTE: Generally, you only need your expedition number and "QAQC".  There may be others if your expedition included data and/or material from other expedition(s).

select x_expedition, count(*) from transfer.new_sample where x_expedition in ('351', '195', '125', '60', 'QAQC') group by x_expedition order by x_expedition;
select x_expedition, count(*) from lims.sample         where x_expedition in ('351', '195', '125', '60', 'QAQC') group by x_expedition order by x_expedition;
 
select count(*) from transfer.new_test where sample_number in (select sample_number from lims.sample where x_expedition in ('351', '195', '125', '60', 'QAQC'));
select count(*) from lims.test         where sample_number in (select sample_number from lims.sample where x_expedition in ('351', '195', '125', '60', 'QAQC'));
 
select count(*) from transfer.new_result where sample_number in (select sample_number from lims.sample where x_expedition in ('351', '195', '125', '60', 'QAQC'));
select count(*) from lims.result         where sample_number in (select sample_number from lims.sample where x_expedition in ('351', '195', '125', '60', 'QAQC'));
 
select catalog, count(*) from transfer.new_files group by catalog order by catalog;
select catalog, count(*) from lims.files         group by catalog order by catalog;
 
select count(*) from transfer.new_softwareconfig;
select count(*) from descinfo2.softwareconfig;
 
select count(*) from transfer.new_valuelists;
select count(*) from descinfo2.valuelists;
 
spool off
exit
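The pairwise queries above are meant to return identical counts. If each side's output is spooled to a file, a plain diff makes the comparison mechanical. A minimal sketch with made-up counts; the file names and values are illustrative, not from a real run:

```shell
# Illustrative only: compare spooled row counts from the source and transfer sides.
printf '351 12345\nQAQC 678\n' > counts_source.txt
printf '351 12345\nQAQC 678\n' > counts_transfer.txt
if diff -q counts_source.txt counts_transfer.txt > /dev/null; then
  echo "row counts match"
else
  echo "MISMATCH - investigate before proceeding"
fi
```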



Statistics. 344S: ~10 min to run the above steps.
396: ~10 min

To review the log, switch to the other terminal session.
Supply these commands.

cd /backup/LIMSJR/dpdump
less eox_YOUR_EXP_NUMBER.log

Press 'q' to quit reviewing the log, or Ctrl+C.
Leave the console window open for later use.
Place the EOX database snapshot log in the R:\AD\support current expedition EOX directory.
Example from a DOS prompt on the Build Box.

# Log into Novell.
r:
cd \AD\support
pscp oracle@k1.ship.iodp.tamu.edu:/backup/LIMSJR/dpdump/eox*.log .
# that is a SPACE PERIOD at the end of the above line
Then go to the Novell directory, create an expedition-specific subdirectory, and move the log there.

7. Make a transfer-only snapshot as oracle@k1.

From the prompt [oracle@k1 ~]$

run these commands using the transfer logon

# Note that the folder referenced in the cd command is the DATA_PUMP_DIR (see note at the top of this page about that)
cd /backup/LIMSJR/dpdump
# this next command is a single line
time expdp transfer directory=data_pump_dir schemas=transfer logfile=limsjr_transfer-YOUR_EXP_NUMBER-export.log dumpfile=limsjr_transfer-YOUR_EXP_NUMBER-YOUR_INITIALS.dmpdp

Statistics. 344S: 3 min 13 sec. 345: 4 min 23 sec.

8. Test the export files. Did you extract what you expected?

This test verifies that content can be read back from the exports created above, increasing confidence that the exported content is good.
From the prompt [oracle@k1 ~]$
run these commands using the transfer logon

# this command is all one line with a space between the sections
time impdp transfer directory=data_pump_dir tables=lims.sample remap_tablespace=labware:transfer remap_schema=lims:transfer remap_table=sample:verify_full_sample exclude=GRANT exclude=INDEX exclude=CONSTRAINT exclude=STATISTICS exclude=TRIGGER dumpfile=limsjr_full-YOUR_EXP_NUMBER-YOUR_INITIALS.dmpdp
 
# this command is all one line
time impdp transfer directory=data_pump_dir tables=transfer.new_sample remap_table=new_sample:verify_xfer_sample exclude=INDEX exclude=CONSTRAINT exclude=STATISTICS dumpfile=limsjr_transfer-YOUR_EXP_NUMBER-YOUR_INITIALS.dmpdp


For both of these commands you should see a successful import of the content into the TRANSFER schema, something like this:

...
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "TRANSFER"."VERIFY_FULL_SAMPLE"              176.0 MB  560884 rows
Processing object type TABLE_EXPORT/TABLE/COMMENT
...

Log in to the transfer schema. Look at tables verify_full_sample and verify_xfer_sample. They are expected to contain the expedition data you just exported.
Remove the verification tables.

drop table transfer.verify_full_sample purge;
drop table transfer.verify_xfer_sample purge;

Statistics (integrity check on each export). 344S: 3 min 25 sec; 345: 3 min 43 sec; 351: full 1 min 25 sec, xfer 24 sec.

9. Restore database read/write capability.

From the prompt [oracle@k1 ~]$

run these commands, supplying the appropriate credentials

sqlplus YOUR_DBA_ACCOUNT
sql>
alter tablespace labware read write;
exit

Stop. Leave this window open for use below.

10. Compress the exports for transport. Cleanup. 

From the prompt [oracle@k1 ~]$
run these commands

cd /backup/LIMSJR/dpdump
time bzip2 --best limsjr*YOUR_EXP_NUMBER*.dmpdp
# once the above completes, then run this
ls -l
md5sum *.bz2

Cut and paste the digest keys and file listing into a notification email. This enables remote users to verify the integrity of the files.

Please also log these in a .txt file in the expedition-named subdirectory under \AD\support\; see other expeditions for examples.
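For reference, recipients at HQ can verify the files against the emailed digests with md5sum -c. A minimal sketch using a throwaway file; the names are placeholders, not the real .bz2 exports:

```shell
# Sketch only: verify files against a saved digest list, as a remote user would.
printf 'dump data\n' > limsjr_demo.dmpdp.bz2      # placeholder for a real export
md5sum limsjr_demo.dmpdp.bz2 > eox_checksums.txt  # the list pasted into the email
md5sum -c eox_checksums.txt                       # reports "...: OK" when intact
```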

Statistics. 367: 22 min to compress both files (~5 GiB to < 1 GiB); 345: 22 min 53 sec for bzip step. 351: full 137 min 46 for ~8 GiB content.  360: 67 min 27 sec for ~5 GiB file. 45-60 min for high recovery expeditions.
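Before handing the archives off, they can also be integrity-tested locally with bzip2 -t, which test-decompresses without writing any output. A quick sketch; the file is a stand-in, not a real export:

```shell
# Sketch only: test-decompress a bzip2 archive to confirm it is not corrupt.
printf 'sample dump content\n' > demo.dmpdp
bzip2 --best demo.dmpdp             # replaces demo.dmpdp with demo.dmpdp.bz2
bzip2 -t demo.dmpdp.bz2 && echo "archive OK"
```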

Exp 396: We had trouble initially because we ran out of disk space. This was apparently caused by creating the /backup/LIMSJR/dpdump folder under the oracle account, which broke a link that gives the backup folder additional drive space. Don't do that.

11. Notify everyone. Database snapshot process is complete.

 Notify jr_developer and expedition technical staff that the process is done.

Speak with the MCS. They will pick up the data from here.

k1.ship.iodp.tamu.edu:/backup/LIMSJR/dpdump
    limsjr*.bz2
    limsjr*.log


The MCS will copy the files over to oem.ship.iodp.tamu.edu:/media/backup/export, one of several staging areas for the EOX tape run.

Inquire with the MCS. Ensure the (above) database content and the (below) ASMAN content are being taken to media for transport to HQ.

12. Send the transfer content home. 

NOTE: Please transfer only the limsjr_transfer file online. The full file can come on tape. (Rakesh)


This is now routine; the Database Group expects it to occur. This method makes the data available a week earlier than can be achieved by transporting tapes.



Two transport methods are available. Apply the one that suits you best.

  • Hand the files over to the MCS, requesting that they be transferred to \\NORTH\FTP\FromShip\EOX\  << Preferred.


When the files have been transferred, notify  Sunil Amarnani (sunil@rajaconsultancy.com); ITOPS <it_ops@iodp.tamu.edu>; Gates, Philip C <gates@iodp.tamu.edu>; Rakesh Mithal <mithal@iodp.tamu.edu>; IODP - programmers <programmers@iodp.tamu.edu>.
