Retrieving Data from BLOB into a file

February 05, 2004 Ambarish -- Thanks for the question regarding "Retrieving Data from BLOB into a file.", version 8.1.7
You Asked
Hi Tom,

We have inserted the BLOB field into the database. The file contents have been 
properly inserted into the database.

I am not able to retrieve the BLOB into a file from the database. The requirement 
is to write the BLOB out to a file.

Thanks a lot for your co-operation.

Regards,
Ambarish. 

and we said...
Prior to Oracle9iR2 you will need to use Java, C, VB, or some other 3GL language.

In 9iR2 -- PL/SQL can write binary files using UTL_FILE.

In 9iR1 and before, you will need to use Java or some other 3GL that has the 
ability to do this.

If you have my book "Expert One-on-One Oracle" -- I do have an example in there 
in Pro*C that writes BLOBs to files in an external procedure -- so it works like 
a stored procedure call. 
Review & Followup

Rating: 3
Hope this helps  November 08, 2002
Reviewer:  Mio  from US

Here's a little Java client utility that I use with 8.1.7. Look at the usage 
string: query_file should contain a query that returns a single LOB locator. By 
default it reads from Oracle and writes to lob_file. Pass -write to read from 
lob_file and write to Oracle. By default it assumes a CLOB; pass -blob for 
binary data.
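
For example, query_file would contain nothing more than a single-row query, 
something like this (documents/doc_blob are made-up names for illustration; note 
there is no trailing semicolon, since the file's text is passed straight to 
executeQuery):

select doc_blob from documents where doc_id = 42

The query file is then passed with -qf, and the target file with -lf, as shown 
in the usage string.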

// file LobUtils.java
import java.io.InputStream;
import java.io.OutputStream;
import java.io.IOException;
import java.io.FileInputStream;
import java.io.FileOutputStream;

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.Blob;
import oracle.sql.BLOB;
import java.sql.SQLException;

import java.sql.Clob;
import oracle.sql.CLOB;

import java.io.FileReader;
import java.io.BufferedReader;
import java.io.StringWriter;
import java.io.FileNotFoundException;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.Writer;
import java.io.InputStreamReader;

public class LobUtils 
{
    final static int bBufLen = 4 * 8192;
    String query;
    String connectString;
    String outFile;
    Connection conn;

    public LobUtils(String connectString, String query, String outFile) {
        this.connectString = connectString;
        this.query = query;
        this.outFile = outFile;
        this.conn = null;
    }

    public static void main(String[] args)
      throws FileNotFoundException, IOException, SQLException {
        if (args.length < 5) usage();
        int ii = 0;
        String connString = args[ii++];
        String queryFile = null;
        String outputFile = null;

        boolean read = true;
        boolean isBinary = false;

        for (; ii < args.length; ii++) {
            if (args[ii].equals("-write"))
                read = false;

            if (args[ii].equals("-blob"))
                isBinary = true;

            if (args[ii].equals("-qf") && ii < args.length - 1)
                queryFile = args[++ii];

            if (args[ii].equals("-lf") && ii < args.length - 1)
                outputFile = args[++ii];

        }

        if (queryFile == null || outputFile == null) usage();

            // all set
        if (read) {
            BufferedReader freader = new BufferedReader(new FileReader(queryFile));
            StringWriter swriter = new StringWriter();
            int bufLen = 1024;
            char[] cbuf = new char[bufLen];
            int length = -1;
            while ((length = freader.read(cbuf, 0, bufLen)) != -1) {
                swriter.write(cbuf, 0, length);
            }
            freader.close();
            swriter.close();
            String query = swriter.toString();

            LobUtils lutils = new LobUtils(connString, query, outputFile);
            if (isBinary) {
                Blob blob = lutils.getBlob();
                long wrote = lutils.writeBlobToFile(blob);
                System.out.println("Wrote " + wrote + " bytes to file " + outputFile);
            } else {
                Clob clob = lutils.getClob();
                long wrote = lutils.writeClobToFile(clob);
                System.out.println("Wrote " + wrote + " bytes to file " + outputFile);
            }
        } else {
            BufferedReader freader = new BufferedReader(new FileReader(queryFile));
            StringWriter swriter = new StringWriter();
            int bufLen = 1024;
            char[] cbuf = new char[bufLen];
            int length = -1;
            while ((length = freader.read(cbuf, 0, bufLen)) != -1) {
                swriter.write(cbuf, 0, length);
            }
            freader.close();
            swriter.close();
            String query = swriter.toString();

            LobUtils lutils = new LobUtils(connString, query, outputFile);
            Clob clob = lutils.getClob();
            InputStream creader = new FileInputStream(outputFile);
            long wrote = lutils.writeToOraClob(clob, creader);
            System.out.println("Wrote " + wrote + " bytes from file " + outputFile);
        }
    }

    public Clob getClob()
      throws SQLException {
        conn = ConnUtil.getConnection(connectString);
        conn.setAutoCommit(false);
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(query);
        Clob clob = null;
        if (rs.next()) {
            clob = rs.getClob(1);
        }
        return clob;
    }

    public Blob getBlob()
      throws SQLException {
        conn = ConnUtil.getConnection(connectString);
        conn.setAutoCommit(false);
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(query);
        Blob blob = null;
        if (rs.next()) {
            blob = rs.getBlob(1);
        }
        return blob;
    }

    public long writeClobToFile(Clob clob)
      throws IOException, SQLException {
        long wrote = 0;
        BufferedWriter fwriter = new BufferedWriter(new FileWriter(outFile));
        wrote = readFromClob(clob, fwriter);
        fwriter.close();
        conn.commit();
        conn.close();
        return wrote;
    }

    public long writeBlobToFile(Blob blob)
      throws IOException, SQLException {
        long wrote = 0;
        OutputStream fwriter = new FileOutputStream(outFile);
        wrote = readFromBlob(blob, fwriter);
        fwriter.close();
        conn.commit();
        conn.close();
        return wrote;
    }

    private static void usage() {
        System.err.println("Usage: java LobUtils user/passwd@sid [-write] [-blob] -qf query_file -lf lob_file");
        System.exit(1);
    }

    public static long writeToOraBlob(Blob blob, InputStream in)
      throws SQLException, IOException {
        BLOB oblob = (BLOB)blob;
        OutputStream out = oblob.getBinaryOutputStream();
        int length = -1;
        long wrote = 0;
        int chunkSize = oblob.getChunkSize();
        byte[] buf = new byte[chunkSize];
        while ((length = in.read(buf)) != -1) {
            out.write(buf, 0, length);
            wrote += length;
        }
        out.close();
        return wrote;
    }

    public long writeToOraClob(Clob clob, InputStream in)
      throws SQLException, IOException {
        CLOB oclob = (CLOB)clob;
        OutputStream out = oclob.getAsciiOutputStream();
        int length = -1;
        long wrote = 0;
        int chunkSize = oclob.getChunkSize();
        byte[] buf = new byte[chunkSize];
        while ((length = in.read(buf)) != -1) {
            out.write(buf, 0, length);
            wrote += length;
        }
        out.close();
        conn.commit();
        return wrote;
    }

    public static long readFromBlob(Blob blob, OutputStream out)
      throws SQLException, IOException {
        InputStream in = blob.getBinaryStream();
        int length = -1;
        long read = 0;
        byte[] buf = new byte[bBufLen];
        while ((length = in.read(buf)) != -1) {
            out.write(buf, 0, length);
            read += length;
        }
        in.close();
        return read;
    }

    public static long readFromClob(Clob clob, Writer out)
      throws SQLException, IOException {
        BufferedReader in = new BufferedReader(new InputStreamReader(clob.getAsciiStream()));
        int length = -1;
        long read = 0;
        char[] buf = new char[bBufLen];
        while ((length = in.read(buf, 0, bBufLen)) != -1) {
            out.write(buf, 0, length);
            read += length;
        }
        in.close();
        return read;
    }
}
 


Rating: 3
Retrieving Data from BLOB into a file  November 08, 2002
Reviewer:  Ambarish Ganguly  from Shippensburg,PA,USA

Isn't it a bit surprising that, using Oracle packages, I can successfully write 
to a BLOB from a file, whereas we cannot read the BLOB into a file?
I am using VB and wanted to do all processing on the server side, hence I tried 
to find a solution here. 

Followup:
9iR2 -- you can -- it is easy.

8i -- you can; Java can do this easily if you like.
 


Rating: 3
Retrieving Data from BLOB into a file... using UTL_FILE  November 11, 2002
Reviewer:  Fred Howell  from Merrimack, NH USA

I am currently searching through the Oracle documentation for UTL_FILE looking 
for a proc that will allow me to write BLOBs to a file. Unfortunately I don't see 
a proc that will work. The BLOBs we are working with are scanned images, not 
text. Is there a UTL_FILE.PUT_BLOB (I tried that -- or is there an equivalent or 
alternative)? I have also considered the Java alternative but would prefer a 
PL/SQL solution, as that is where the group's expertise is for support and 
maintenance. Any assistance will be greatly appreciated. Thanks, Fred 

Followup:
Until 9iR2 -- as stated -- Java, C, or some other 3GL is the only solution. 


Rating: 4
do you have an example for 9iR2?  February 10, 2003
Reviewer:  gs

Tom,

Do you by any chance have an example on how to get this done in 9iR2?

Thanks 

Followup:
it is just utl_file.put_raw.  You'll have a blob, you'll dbms_lob.read or 
dbms_lob.substr whatever part of the blob you need, and utl_file.put_raw it. 
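
For illustration only -- a minimal sketch along those lines (the directory object 
MY_DIR and the table/column documents.doc_blob are made-up names; it assumes 
9iR2 or later, since it relies on utl_file.put_raw):

declare
    l_blob   blob;
    l_len    integer;
    l_file   utl_file.file_type;
    l_buffer raw(32000);
    l_amount binary_integer;
    l_pos    integer := 1;
begin
    select doc_blob into l_blob
      from documents
     where doc_id = 1;

    l_len  := dbms_lob.getlength( l_blob );
    l_file := utl_file.fopen( 'MY_DIR', 'doc.bin', 'w', 32767 );

    while l_pos <= l_len loop
        -- read the next chunk, staying well under the 32k per-call limit
        l_amount := least( 32000, l_len - l_pos + 1 );
        dbms_lob.read( l_blob, l_amount, l_pos, l_buffer );
        utl_file.put_raw( l_file, l_buffer, true );   -- third argument = autoflush
        l_pos := l_pos + l_amount;
    end loop;

    utl_file.fclose( l_file );
exception
    when others then
        if utl_file.is_open( l_file ) then
            utl_file.fclose( l_file );
        end if;
        raise;
end;
/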


Rating: 4
How do I get around the 32K limit, if using utl_file?  February 11, 2003
Reviewer:  gs.

Thanks Tom for your response.

This is what I was trying; I get the following error when the length 
exceeds 32K.

declare
*
ERROR at line 1:
ORA-29285: file write error
ORA-06512: at "SYS.UTL_FILE", line 18
ORA-06512: at "SYS.UTL_FILE", line 375
ORA-06512: at "SYS.UTL_FILE", line 990
ORA-06512: at line 35

declare
    i1 blob;
    i2 number;
    amt number := 10000;
    len number;
    my_vr raw(10000);
        l_output utl_file.file_type;
    p_dir varchar2(30) default 'TEST_DIR';
    p_file varchar2(30) default 'gs_test1';
begin
-- get the blob locator  
    l_output := utl_file.fopen(p_dir, p_file, 'w', 32760);
    for l_cur in (SELECT record#, xsl mylob FROM docxsl where rownum = 1 order 
by record#)
    loop
        len :=  DBMS_LOB.GETLENGTH(l_cur.mylob);
        i1 := l_cur.mylob ;
        dbms_output.put_line('Length of the Column :  ' || to_char(len));
        i2 := 1;
        if len < 10000 then
            DBMS_LOB.READ(i1,len,i2,my_vr);         
            utl_file.put(l_output, my_vr );
        else
            DBMS_LOB.READ(i1,amt,i2,my_vr);
            utl_file.put(l_output, UTL_RAW.CAST_TO_VARCHAR2(my_vr) );
        end if;  

        i2 := i2 + amt;  
        while (i2 < len) loop   -- loop till entire data is fetched    
        dbms_output.put_line('i2 : ' || to_char(i2));
            DBMS_LOB.READ(i1,amt,i2,my_vr);     
            utl_file.put(l_output, UTL_RAW.CAST_TO_VARCHAR2(my_vr) );
            utl_file.fflush(l_output);
            i2 := i2 + amt ; 
        end loop;  
            utl_file.fclose(l_output);
    end loop;
end ;
/

Note: I combined the example from your site with one from Metalink.

I have XSL files stored as BLOBs in the DB. I wanted to retrieve them exactly the 
way they were stored. I don't want to put a "new line" after every 32K. Can I do 
this using PL/SQL?

-- I thought it was related to the question asked, hence I am not submitting a 
new one.

Thanks very much. 

Followup:
If they are XSL, why are they blobs?  

But in any case, utl_file has a maximum of 32k per write -- an unavoidable, 
no-workaround, hard limit. 


Rating: 5
No workaround!!!!  March 17, 2003
Reviewer:  Steve  from UK

I'm trying to write a DBF file straight out of the database.  Right after the 
header I then start to output the rows by looping through them, thus:

for i in 1..vRow.count loop
    
    utl_file.put_raw(v_outputfile,vRow(i),TRUE);
        
end loop;

Now, this is fine for a small table, but for anything bigger I get the file write 
error -- ORA-29285.  Is there really no way around this?  Surely there must be! 

Followup:
Get an example together -- use all_objects and see if it reproduces -- and if so, 
post the code here and I'll look at it. 


Rating: 5
You'll recognise most of this!  March 17, 2003
Reviewer:  Steve  from UK

The code is an extended version of your DBF file extractor and it also uses your 
str2tbl function!  It's not complete yet, as it can't handle dates or decimals 
properly, but it demonstrates the problem:

CREATE OR REPLACE package dbase_pkg
as

    -- procedure to a load a table with records
    -- from a DBASE file.
    --
    -- Uses a BFILE to read binary data and dbms_sql
    -- to dynamically insert into any table you
    -- have insert on.
    --
    -- p_dir is the name of an ORACLE Directory Object
    --       that was created via the CREATE DIRECTORY
    --       command
    --
    -- p_file is the name of a file in that directory
    --        will be the name of the DBASE file
    --
    -- p_tname is the name of the table to load from
    --
    -- p_cnames is an optional list of comma separated
    --          column names.  If not supplied, this pkg
    --          assumes the column names in the DBASE file
    --          are the same as the column names in the
    --          table
    --
    -- p_show boolean that if TRUE will cause us to just
    --        PRINT (and not insert) what we find in the
    --        DBASE files (not the data, just the info
    --        from the dbase headers....)

    procedure load_Table( p_dir    in varchar2,
                          p_file   in varchar2,
                          p_tname  in varchar2,
                          p_cnames in varchar2 default NULL,
                          p_show   in BOOLEAN default FALSE);
                          
    -- procedure to dump a file to disk in DBASE file format.
    --
    -- in addition to the above, p_where_clause is the filter 
    -- which if null will dump the entire table.  
                          
    procedure dump_table (p_dir          in varchar2,
                            p_file         in varchar2,
                          p_tname        in varchar2,
                          p_cnames       in varchar2 default NULL,
                          p_where_clause in varchar2 default ' 1=1 ');
end;
/
CREATE OR REPLACE package body dbase_pkg
as

-- Might have to change depending on platform!!!
-- Controls the byte order of binary integers read in
-- from the dbase file
BIG_ENDIAN      constant boolean default TRUE;

type dbf_header is RECORD
(
    version    varchar2(25), -- dBASE version number
    year       int,          -- 1 byte int year, add to 1900
    month      int,          -- 1 byte month
    day        int,             -- 1 byte day
    no_records VARCHAR2(8),             -- number of records in file,
                             -- 4 byte int
    hdr_len    VARCHAR2(4),             -- length of header, 2 byte int
    rec_len    VARCHAR2(4),             -- number of bytes in record,
                             -- 2 byte int
    no_fields  int           -- number of fields
);


type field_descriptor is RECORD
(
    name     varchar2(11),
    fname    varchar2(30),
    type     char(1),
    length   int,   -- 1 byte length
    decimals int    -- 1 byte scale
);

type field_descriptor_array
is table of
field_descriptor index by binary_integer;


type rowArray
is table of
varchar2(4000) index by binary_integer;


g_cursor binary_integer default dbms_sql.open_cursor;


function ite( tf in boolean, yes in varchar2, no in varchar2 )
        return varchar2
is
begin
    if ( tf ) then
       return yes;
    else
       return no;
    end if;
end ite;




-- Function to convert a binary unsigned integer
-- into a PLSQL number

function to_int( p_data in varchar2 ) return number
is
    l_number number default 0;
    l_bytes  number default length(p_data);
begin
    if (big_endian)
    then
        for i in 1 .. l_bytes loop
            l_number := l_number +
                              ascii(substr(p_data,i,1)) *
                                           power(2,8*(i-1));
        end loop;
    else
        for i in 1 .. l_bytes loop
            l_number := l_number +
                         ascii(substr(p_data,l_bytes-i+1,1)) *
                         power(2,8*(i-1));
        end loop;
    end if;

    return l_number;
end;


-- Routine to parse the DBASE header record, can get
-- all of the details of the contents of a dbase file from
-- this header

procedure get_header
(p_bfile        in bfile,
 p_bfile_offset in out NUMBER,
 p_hdr          in out dbf_header,
 p_flds         in out field_descriptor_array )
is
    l_data            varchar2(100);
    l_hdr_size        number default 32;
    l_field_desc_size number default 32;
    l_flds            field_descriptor_array;
begin
    p_flds := l_flds;

    l_data := utl_raw.cast_to_varchar2(
                       dbms_lob.substr( p_bfile,
                                        l_hdr_size,
                                        p_bfile_offset ) );
    p_bfile_offset := p_bfile_offset + l_hdr_size;

    p_hdr.version    := ascii( substr( l_data, 1, 1 ) );
    p_hdr.year       := 1900 + ascii( substr( l_data, 2, 1 ) );
    p_hdr.month      := ascii( substr( l_data, 3, 1 ) );
    p_hdr.day        := ascii( substr( l_data, 4, 1 ) );
    p_hdr.no_records := to_int( substr( l_data,  5, 4 ) );
    p_hdr.hdr_len    := to_int( substr( l_data,  9, 2 ) );
    p_hdr.rec_len    := to_int( substr( l_data, 11, 2 ) );
    p_hdr.no_fields  := trunc( (p_hdr.hdr_len - l_hdr_size)/
                                           l_field_desc_size );


    for i in 1 .. p_hdr.no_fields
    loop
        l_data := utl_raw.cast_to_varchar2(
                         dbms_lob.substr( p_bfile,
                                          l_field_desc_size,
                                          p_bfile_offset ));
        p_bfile_offset := p_bfile_offset + l_field_desc_size;

        p_flds(i).name := rtrim(substr(l_data,1,11),chr(0));
        p_flds(i).type := substr( l_data, 12, 1 );
        p_flds(i).length  := ascii( substr( l_data, 17, 1 ) );
        p_flds(i).decimals := ascii(substr(l_data,18,1) );
    end loop;

    p_bfile_offset := p_bfile_offset +
                          mod( p_hdr.hdr_len - l_hdr_size,
                               l_field_desc_size );
end;


function build_insert
( p_tname in varchar2,
  p_cnames in varchar2,
  p_flds in field_descriptor_array ) return varchar2
is
    l_insert_statement long;
begin
    l_insert_statement := 'insert into ' || p_tname || '(';
    if ( p_cnames is NOT NULL )
    then
        l_insert_statement := l_insert_statement ||
                              p_cnames || ') values (';
    else
        for i in 1 .. p_flds.count
        loop
            if ( i <> 1 )
            then
               l_insert_statement := l_insert_statement||',';
            end if;
            l_insert_statement := l_insert_statement ||
                            '"'||  p_flds(i).name || '"';
        end loop;
        l_insert_statement := l_insert_statement ||
                                           ') values (';
    end if;
    for i in 1 .. p_flds.count
    loop
        if ( i <> 1 )
        then
           l_insert_statement := l_insert_statement || ',';
        end if;
        if ( p_flds(i).type = 'D' )
        then

            l_insert_statement := l_insert_statement ||
                     'to_date(:bv' || i || ',''yyyymmdd'' )';
        else
            l_insert_statement := l_insert_statement ||
                                                ':bv' || i;
        end if;
    end loop;
    l_insert_statement := l_insert_statement || ')';

    return l_insert_statement;
end;


function get_row
( p_bfile in bfile,
  p_bfile_offset in out number,
  p_hdr in dbf_header,
  p_flds in field_descriptor_array ) return rowArray
is
    l_data     varchar2(4000);
    l_row   rowArray;
    l_n     number default 2;
begin
    l_data := utl_raw.cast_to_varchar2(
                   dbms_lob.substr( p_bfile,
                                    p_hdr.rec_len,
                                    p_bfile_offset ) );
    p_bfile_offset := p_bfile_offset + p_hdr.rec_len;

    l_row(0) := substr( l_data, 1, 1 );

    for i in 1 .. p_hdr.no_fields loop
        l_row(i) := rtrim(ltrim(substr( l_data,
                                        l_n,
                                        p_flds(i).length ) ));
        if ( p_flds(i).type = 'F' and l_row(i) = '.' )
        then
            l_row(i) := NULL;
        end if;
        l_n := l_n + p_flds(i).length;
    end loop;
    return l_row;
end get_row;


procedure show( p_hdr    in dbf_header,
                p_flds   in field_descriptor_array,
                p_tname  in varchar2,
                p_cnames in varchar2,
                p_bfile  in bfile )
is
    l_sep varchar2(1) default ',';

    procedure p(p_str in varchar2)
    is
        l_str long default p_str;
    begin
        while( l_str is not null )
        loop
            dbms_output.put_line( substr(l_str,1,250) );
            l_str := substr( l_str, 251 );
        end loop;
    end;
begin
    p( 'Sizeof DBASE File: ' || dbms_lob.getlength(p_bfile) );

    p( 'DBASE Header Information: ' );
    p( chr(9)||'Version = ' || p_hdr.version );
    p( chr(9)||'Year    = ' || p_hdr.year   );
    p( chr(9)||'Month   = ' || p_hdr.month   );
    p( chr(9)||'Day     = ' || p_hdr.day   );
    p( chr(9)||'#Recs   = ' || p_hdr.no_records);
    p( chr(9)||'Hdr Len = ' || p_hdr.hdr_len  );
    p( chr(9)||'Rec Len = ' || p_hdr.rec_len  );
    p( chr(9)||'#Fields = ' || p_hdr.no_fields );

    p( chr(10)||'Data Fields:' );
    for i in 1 .. p_hdr.no_fields
    loop
        p( 'Field(' || i || ') '
             || 'Name = "' || p_flds(i).name || '", '
             || 'Type = ' || p_flds(i).Type || ', '
             || 'Len  = ' || p_flds(i).length || ', '
             || 'Scale= ' || p_flds(i).decimals );
    end loop;

    p( chr(10) || 'Insert We would use:' );
    p( build_insert( p_tname, p_cnames, p_flds ) );

    p( chr(10) || 'Table that could be created to hold data:');
    p( 'create table ' || p_tname );
    p( '(' );

    for i in 1 .. p_hdr.no_fields
    loop
        if ( i = p_hdr.no_fields ) then l_sep := ')'; end if;
        dbms_output.put
        ( chr(9) || '"' || p_flds(i).name || '"   ');

        if ( p_flds(i).type = 'D' ) then
            p( 'date' || l_sep );
        elsif ( p_flds(i).type = 'F' ) then
            p( 'float' || l_sep );
        elsif ( p_flds(i).type = 'N' ) then
            if ( p_flds(i).decimals > 0 )
            then
                p( 'number('||p_flds(i).length||','||
                              p_flds(i).decimals || ')' ||
                              l_sep );
            else
                p( 'number('||p_flds(i).length||')'||l_sep );
            end if;
        else
            p( 'varchar2(' || p_flds(i).length || ')'||l_sep);
        end if;
    end loop;
    p( '/' );
end;


procedure load_Table( p_dir in varchar2,
                      p_file in varchar2,
                      p_tname in varchar2,
                      p_cnames in varchar2 default NULL,
                      p_show in boolean default FALSE )
is
    l_bfile      bfile;
    l_offset  number default 1;
    l_hdr     dbf_header;
    l_flds    field_descriptor_array;
    l_row      rowArray;
begin
    l_bfile := bfilename( p_dir, p_file );
    dbms_lob.fileopen( l_bfile );

    get_header( l_bfile, l_offset, l_hdr, l_flds );

    if ( p_show )
    then
        show( l_hdr, l_flds, p_tname, p_cnames, l_bfile );
    else
        dbms_sql.parse( g_cursor,
                        build_insert(p_tname,p_cnames,l_flds),
                        dbms_sql.native );

        for i in 1 .. l_hdr.no_records loop
            l_row := get_row( l_bfile,
                              l_offset,
                              l_hdr,
                              l_flds );

            if ( l_row(0) <> '*' ) -- deleted record
            then
                for i in 1..l_hdr.no_fields loop
                    dbms_sql.bind_variable( g_cursor,
                                            ':bv'||i,
                                            l_row(i),
                                            4000 );
                end loop;
                if ( dbms_sql.execute( g_cursor ) <> 1 )
                then
                    raise_application_error( -20001,
                                 'Insert failed ' || sqlerrm );
                end if;
            end if;
        end loop;
    end if;

    dbms_lob.fileclose( l_bfile );
exception
    when others then
        if ( dbms_lob.isopen( l_bfile ) > 0 ) then
            dbms_lob.fileclose( l_bfile );
        end if;
        RAISE;
end;

procedure put_header (p_tname        in varchar2,
                        p_cnames       in varchar2 DEFAULT NULL,
                      l_hdr          in out dbf_header,
                      vFlds          in out field_descriptor_array)
                      

                      
is

v_value_list  strTableType;

vCursor   varchar2(2000);
type rc IS ref cursor;
col_cur rc;
i  INTEGER:=0;

begin

IF p_cnames IS NOT NULL THEN
    v_value_list:=str2tbl(UPPER(p_cnames));
    vCursor:='select substr(column_name,1,11), 
                     case data_type 
                     when ''DATE'' then ''D''
                     when ''NUMBER'' then ''N''
                     else ''C'' end ,
                     case data_type
                     when ''NUMBER'' then NVL(data_precision,22)
                     when ''DATE'' then 8
                     else data_length end,
                     case data_type
                     when ''NUMBER'' then data_scale
                     end ,
                     column_name   from all_tab_cols 
                where column_name IN (select * from TABLE 
(cast(str2tbl(UPPER('''||p_cnames||''')) as strTableType)))
                and   table_name='''||upper(p_tname)||'''
                order by column_id';

else

    vCursor:='select SUBSTR(column_name,1,11), 
                     case data_type 
                     when ''DATE'' then ''D''
                     when ''NUMBER'' then ''N''
                     else ''C'' end ,
                     case data_type
                     when ''NUMBER'' then NVL(data_precision,22)
                     when ''DATE'' then 8
                     else data_length end,
                     case data_type
                     when ''NUMBER'' then data_scale
                     end ,
                     column_name
              from all_tab_cols
              where table_name='''||upper(p_tname)||'''
              order by column_id';
END IF;

open col_cur for vCursor;

loop
    i:=i+1;
    fetch col_cur into 
vFlds(i).name,vFlds(i).type,vFlds(i).length,vFlds(i).decimals,vFlds(i).fname;
    exit when col_cur%notfound;
    
end loop;

close col_cur;

l_hdr.version    :='03';
l_hdr.year       :=to_number(to_char(sysdate,'yyyy'))-1900;
l_hdr.month      :=to_number(to_char(sysdate,'mm'));
l_hdr.day        :=to_number(to_char(sysdate,'dd'));
l_hdr.rec_len    :=1; -- to be set later 
l_hdr.no_fields  :=vFlds.COUNT;
l_hdr.hdr_len    :=to_char((l_hdr.no_fields*32)+33,'FM000x');


end;


procedure put_rows (p_tname IN  varchar2,
                      p_where_clause in varchar2 default '1=1 ',
                    vRow in out rowarray,
                    vFlds in field_descriptor_array)

is
type rc is ref cursor;

cur rc;

i  integer:=0;
vSelectList  VARCHAR2(32767);

v_cur VARCHAR2(32767);

begin


for l in 1..vFlds.count loop

    
vSelectList:=vSelectList||ite(l!=1,'||','')||'utl_raw.cast_to_raw(rpad(NVL('|| 
case when vFlds(l).type='N' then 'to_char(' end ||vFlds(l).fname||case when 
vFlds(l).type='N' then ')' end ||','' ''),'||vFlds(l).length||','' ''))';

end loop;

v_cur:='select '||vSelectList||' from '||p_tname||' where '||p_where_clause;

open cur for v_cur;

loop

    i:=i+1;
    fetch cur into vRow(i);
    exit when cur%notfound;
    
end loop;

close cur;

end;


procedure dump_table (p_dir          in varchar2,
                        p_file         in varchar2,
                      p_tname        in varchar2,
                      p_cnames       in varchar2 default NULL,
                      p_where_clause in varchar2 default ' 1=1 ')
                      
    

is
l_hdr dbf_header;
vFlds field_descriptor_array;
vRow  rowarray;
v_cnames  VARCHAR2(4000);
v_outputfile UTL_FILE.FILE_TYPE;

vCount int;

vStartTime  DATE;
vEndTime    DATE;
begin

    vStartTime:=sysdate;

    put_header(p_tname,p_cnames,l_hdr,vFlds);
    
    
    put_rows(p_tname,p_where_clause,vRow,vFlds);
    
    v_outputfile := utl_file.fopen(p_dir,p_file,'w',32767);
    
    for i in 1..vFlds.count loop
    
        l_hdr.rec_len:=l_hdr.rec_len+vFlds(i).length;
        
    end loop;
    
    l_hdr.rec_len    :=to_char(to_number(l_hdr.rec_len),'FM000x');
    
    l_hdr.rec_len   :=substr(l_hdr.rec_len,-2)||
                       substr(l_hdr.rec_len,1,2);
                      
    l_hdr.no_records :=to_char(vRow.count,'FM0000000x'); 
    
    l_hdr.no_records:=substr(l_hdr.no_records,-2)||
                       substr(l_hdr.no_records,5,2)||
                      substr(l_hdr.no_records,3,2)||
                      substr(l_hdr.no_records,1,2);
                      
    
    l_hdr.hdr_len:=substr(l_hdr.hdr_len,-2)||
                       substr(l_hdr.hdr_len,1,2);


    utl_file.put_raw(v_outputFile,
              
rpad(l_hdr.version||to_char(l_hdr.year,'FM0x')||to_char(l_hdr.month,'FM0x')||
              to_char(l_hdr.day,'FM0x')||l_hdr.no_records||l_hdr.hdr_len||
              l_hdr.rec_len,64,'0'));
              
    for i in 1..vFlds.count loop
    
        
        utl_file.put_raw(v_outputFile,
                         utl_raw.cast_to_raw(vFlds(i).name)||
                         replace(rpad('00',12-length(vFlds(i).name),'#'),'#','00')||
                         utl_raw.cast_to_raw(vFlds(i).type)||'00000000'||
                         to_char(vFlds(i).length,'FM0x')||'000000000000000000000000000000');
    end loop;
    
    -- terminator for the field names 
    utl_file.put_raw(v_outputFile,'0D');
    
    
    for i in 1..vRow.count loop
    
            
        utl_file.put_raw(v_outputfile,'20'||vRow(i),TRUE);
        
            end loop;
    
    if utl_file.IS_OPEN(v_outputFile ) then
        UTL_FILE.FCLOSE(v_outputFile);
    end if;
    
    vEndTime:=sysdate;
    
    dbms_output.put_line('Started - '||to_char(vStartTime,'HH24:MI'));
    dbms_output.put_line('Finished - '||to_char(vEndTime,'HH24:MI'));
    -- elapsed: add the date difference to a fixed date so it can be formatted as HH24:MI
    dbms_output.put_line('Elapsed - '||to_char(trunc(sysdate)+(vEndTime-vStartTime),'HH24:MI'));
exception
when others then
    utl_file.fclose(v_outputFile);
    raise;
    
end;

end;
/
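
(Note: the package above references strTableType and str2tbl, which are not 
included in this post. A commonly used definition -- reproduced here only as an 
assumption about what Steve's code expects, so the example is self-contained -- 
would be:)

create or replace type strTableType as table of varchar2(255);
/

create or replace function str2tbl( p_str in varchar2 ) return strTableType
as
    l_str   long default p_str || ',';
    l_n     number;
    l_data  strTableType := strTableType();
begin
    loop
        l_n := instr( l_str, ',' );
        exit when ( nvl(l_n,0) = 0 );
        l_data.extend;
        l_data( l_data.count ) := ltrim( rtrim( substr( l_str, 1, l_n-1 ) ) );
        l_str := substr( l_str, l_n+1 );
    end loop;
    return l_data;
end;
/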

SQL> exec dbase_pkg.dump_table('c:\dbf','test.dbf','all_objects','owner,object_name,subobject_name,object_id')
BEGIN dbase_pkg.dump_table('c:\dbf','test.dbf','all_objects','owner,object_name,subobject_name,objec

*
ERROR at line 1:
ORA-29285: file write error
ORA-06512: at "STEVE.DBASE_PKG", line 564
ORA-06512: at line 1


SQL> 

But if I limit the number of rows:

SQL> exec dbase_pkg.dump_table('c:\dbf','test.dbf','all_objects','owner,object_name,subobject_name,object_id','rownum<100')

PL/SQL procedure successfully completed.

SQL> 
 

Followup:
A bit hokey, but this seems to work around the issue -- which appears to be a 
line buffering issue:

551      for i in 1..vRow.count
552      loop
553          utl_file.put_raw(v_outputfile,'20'||vRow(i),TRUE);
554          if ( mod(i,200) = 0 )
555          then
556              utl_file.fclose( v_outputFile );
557              v_outputfile := utl_file.fopen(p_dir,p_file,'a',32767);
558          end if;
559      end loop;


Suggestion:

o put together the smallest test case you possibly can
o submit that to support
o use the workaround for the time being...


 


Rating: 5
  March 17, 2003
Reviewer:  steve  from UK

Thanks Tom - I wondered whether closing and re-opening the file would work but 
thought that I must be missing something. I'll prepare a test case for support 
and post here if they come up with a solution.

Once again, thanks

Steve 


Rating: 4
Did the dbf conversion work?  May 19, 2003
Reviewer:  dharma  from Ca

I am trying to convert an Oracle table to DBF format and am not able to do 
that. Did Steve's code work? Just curious -- I don't want to reinvent the wheel!

Thanks 

Followup:
it is right there for you to try -- go for it. 


Rating: 5
Yes it did  May 19, 2003
Reviewer:  Steve  from UK

although you're still constrained by the 32k limit -- Oracle support suggested 
making sure that I had a utl_file.new_line every so often, but this, of course, 
added a couple of unwanted bytes to the file -- OK for an image but not for a 
DBF.  So, in the end, instead of using utl_file.put_raw I had to use a small Java 
proc to output the file -- it works a treat but didn't need 9iR2 to work!! 


Rating: 5
I should add.....  May 20, 2003
Reviewer:  steve  from UK

that Oracle say that the 32k limit is fixed in 10i but that there is no backport 
for it. 


Rating: 5
It Worked!!  May 21, 2003
Reviewer:  dharma  from CA

The first time I tried to do it on my own and messed it up; then the copy-paste 
worked fine :)
Thanks -- the conversion to DBF worked perfectly. I added the code for dates and 
the workaround for utl_file. 

Got a good understanding of utl_raw as well :)

One question that bothers me: the header length is 32 bytes, but why do we have 
to allocate 64 bytes? (I assume it's because it's rpad'ed with 64 '0's -- is it 
because 32 bytes written as hex characters come out to 64 characters?)

Thanks to Steve too :)

-dharma 


Rating: 5
Blob to File (9.2)  August 06, 2003
Reviewer:  Ray White  from Ottawa,on,ca

Hi, I am using one of the above snippets (modified):

declare
    vblob blob;
    vstart number:=1;
    bytelen number := 32000;
    len number;
    my_vr raw(32000);
        l_output utl_file.file_type;
    p_dir varchar2(30) default 'f:\gol\ora_vscan';
    p_file varchar2(30) default 'CMsitegarden.pdf';
begin
-- get the blob locator  
    l_output := utl_file.fopen(p_dir, p_file, 'w', 32760);
    for l_cur in (SELECT DOCUMENT_CONTENT mylob FROM gol_attached_document where 
SUPP_DOCUMENT_NUMBER = 4 and rownum =1)
    loop
        len :=  DBMS_LOB.GETLENGTH(l_cur.mylob);
        vblob := l_cur.mylob ;
        dbms_output.put_line('Length of the Column :  ' || to_char(len));
        vstart := 1;
        while (vstart < len) loop   -- loop till entire data is fetched    
        dbms_output.put_line('vstart : ' || to_char(vstart));
            DBMS_LOB.READ(vblob,bytelen,vstart,my_vr);     
            utl_file.put_raw(l_output,my_vr);
            utl_file.fflush(l_output);
            vstart := vstart + bytelen ; 
        end loop;  
        utl_file.fclose(l_output);
    end loop;
end ;

This outputs my file fine, but it will not open in the associated software.

I.e. a .doc Word cannot open,
a .pdf Acrobat cannot open;
the file sizes match the source.

Can you tell me what I am doing wrong? 

For .txt files in the blob, do I have to do anything different?

tia
Ray 


Rating: 5
Blob to File  August 06, 2003
Reviewer:  Ray  from ottawa, on,ca

More info on the above.

I looked at the hex of both the source and the file after upload and write. Every 
time the source had a 0A, the resulting file had a 0D0A; the resulting file also 
had a 0D0A at the end.


 

Followup:
Please contact support for this one -- looks like a windowism -- \r\n is the DOS 
carriage return/linefeed end-of-line marker.  On Unix this doesn't happen, and 
that's all I've got. 
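
One thing that may be worth checking -- offered only as a guess, not a confirmed 
fix: the snippet above opens the file in text mode ('w'), and on Windows a 
text-mode write translates each LF into CR/LF, which is exactly the 0A -> 0D0A 
difference described. If your release's utl_file.fopen accepts the byte modes 
('rb'/'wb'/'ab'), opening in binary mode should leave the bytes untouched:

    -- hypothetical change to the fopen call in the snippet above;
    -- 'wb' = write byte mode, no newline translation
    l_output := utl_file.fopen(p_dir, p_file, 'wb', 32760);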


Rating: 5
Java program posted by "Mio from US" is really useful. Thanks Mio & Tom.  December 31, 2003
Reviewer:  Sami  from NJ,USA

 


Rating: 4
getAsciiStream vs. getCharacterStream  January 29, 2004
Reviewer:  Sanjay Jha  from Toronto

Tom,
We are facing something very strange here. We have two identical databases, one 
on Windows 2000 and the other on Sun OS, both with the UTF8 character set. Our 
developers coded an application using the getCharacterStream method to access a 
CLOB column, and it worked fine against the database running on Windows 2000, 
but it did not show the right data when run against the database running on 
SunOS -- rather, it showed garbage. 
When we changed to the getAsciiStream method, it works fine against both 
databases. However, we are concerned about the multilingual data (Chinese etc.) 
that will eventually be stored in this column; retrieving that as ASCII will not 
give us the right result (since Chinese uses multi-byte storage for characters, 
unlike ASCII which uses a single byte).
Could you point out where we are missing something? It is surprising that the 
operating system has a bearing on something within the Oracle database; I still 
do not believe that to be true. What is your opinion, and solution if any? 

Regards.    

Followup:
What was the NLS_LANG on the clients? 


Rating: 4
We are running Weblogic as application server  January 30, 2004
Reviewer:  Sanjay Jha  from Toronto

Tom,
We are running the application through BEA WebLogic Server (on Windows 2000). We 
just changed the database connection string for each database while keeping the 
database "client" (the WebLogic server) constant. The results were different 
though, as mentioned (i.e. garbage comes back when we connect to the database 
running on Sun OS). We tried setting NLS_LANG to utf8 as a Windows environment 
variable, and we also tried setting nls_lang to utf8 in the WebLogic startup 
parameters. No difference in the outcome. 
Please help.

Regards. 

Followup:
should not happen -- all things constant -- same DB character sets (please 
verify that -- select * from nls_database_parameters on both and verify) 


Rating: 4
NLS settings are identical...  January 30, 2004
Reviewer:  Sanjay Jha  from Toronto

Tom,
I have just compared the two; they are identical. 
One more piece of information: I had set nls_lang=.utf8, exported the data from 
the database on Windows (character set UTF8), and then imported it into the 
schema in the database running on Sun OS, also with character set UTF8. Will it 
corrupt the data for LOB columns if they are more than 4000 bytes, since they 
will be stored outside the database? Does that have anything to do with the 
problem? However, the problem occurs only when we access the data using 
getCharacterStream, not when using getAsciiStream.
I know it might be a little confusing, but it definitely has something to do 
with character encoding, which behaves differently for data being accessed from 
two different databases that are identical but running on different OSes.
Any help will be highly appreciated, Tom. 

Regards.


tester@TEST_Win8i> select * from v$version;

BANNER
----------------------------------------------------------------
Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
PL/SQL Release 8.1.7.0.0 - Production
CORE    8.1.7.0.0       Production
TNS for 32-bit Windows: Version 8.1.7.0.0 - Production
NLSRTL Version 3.4.1.0.0 - Production

tester@TEST_Win8i> select * from nls_database_parameters;

PARAMETER                      VALUE
------------------------------ --------------------------------
NLS_LANGUAGE                   AMERICAN
NLS_TERRITORY                  AMERICA
NLS_CURRENCY                   $
NLS_ISO_CURRENCY               AMERICA
NLS_NUMERIC_CHARACTERS         .,
NLS_CHARACTERSET               UTF8
NLS_CALENDAR                   GREGORIAN
NLS_DATE_FORMAT                DD-MON-RR
NLS_DATE_LANGUAGE              AMERICAN
NLS_SORT                       BINARY
NLS_TIME_FORMAT                HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT           DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZH:TZM
NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH.MI.SSXFF AM TZH:TZM
NLS_DUAL_CURRENCY              $
NLS_COMP                       BINARY
NLS_NCHAR_CHARACTERSET         UTF8
NLS_RDBMS_VERSION              8.1.7.0.0

18 rows selected.

tester@TEST_SUN8i> select * from v$version;

BANNER
----------------------------------------------------------------
Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
PL/SQL Release 8.1.7.0.0 - Production
CORE    8.1.7.0.0       Production
TNS for Solaris: Version 8.1.7.0.0 - Production
NLSRTL Version 3.4.1.0.0 - Production

tester@TEST_SUN8i> select * from nls_database_parameters;

PARAMETER                      VALUE
------------------------------ --------------------------------
NLS_LANGUAGE                   AMERICAN
NLS_TERRITORY                  AMERICA
NLS_CURRENCY                   $
NLS_ISO_CURRENCY               AMERICA
NLS_NUMERIC_CHARACTERS         .,
NLS_CHARACTERSET               UTF8
NLS_CALENDAR                   GREGORIAN
NLS_DATE_FORMAT                dd-mon-rrrr hh24:mi:ss
NLS_DATE_LANGUAGE              AMERICAN
NLS_SORT                       BINARY
NLS_TIME_FORMAT                HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT           DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZH:TZM
NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH.MI.SSXFF AM TZH:TZM
NLS_DUAL_CURRENCY              $
NLS_COMP                       BINARY
NLS_NCHAR_CHARACTERSET         UTF8
NLS_RDBMS_VERSION              8.1.7.0.0

18 rows selected. 

Followup:
It is not stored outside the database.  It is stored "out of line" in the 
database.

I'll have to refer you to support, as I do pretty much nothing with Java 
(well, as little as humanly possible anyway). 


Rating: 4
I found the problem....  February 03, 2004
Reviewer:  Sanjay Jha  from Toronto

Tom,
The good news is that I figured out the problem and everything is working; the 
bad news is that I still do not understand the reason clearly.
As I mentioned before, I have two identical databases with the UTF8 character 
set, running on Windows 2000 and Sun OS. I logged in to the Windows server, 
first set nls_lang=.utf8, and then exported the entire schema (the dump file was 
created on the Windows platform from the database running on Windows). Then, 
during the same session, I imported the data into the database running on Sun 
OS. Everything worked fine (including some data containing Chinese characters) 
in the database running on Sun OS, except the CLOB data, which showed up as 
garbage.

Dump from database running on Sun OS: 
DUMP(DBMS_LOB.SUBSTR(JOB_OFFER_DESC,10,1),1016) 
-------------------------------------------------------------------------------- 

-------------------- 
Typ=1 Len=20 CharacterSet=UTF8: 
c5,8c,c5,a1,c4,a0,c5,b4,c5,a9,c5,b4,c5,b5,c5,ac,c5,a1,c5,a9 


Dump from database running on Windows OS: 
DUMP(DBMS_LOB.SUBSTR(JOB_OFFER_DESC,10,1),1016) 
-------------------------------------------------------------------------------- 

-------------------- 
Typ=1 Len=10 CharacterSet=UTF8: 4c,61,20,74,69,74,75,6c,61,69 
 
However, I did things a little differently this time: I ftp'd the dump file from 
the Windows platform to the Sun box in binary mode, then logged on to the Sun OS 
and imported it into the database running on Sun OS. Everything works fine now!

Incidentally, when I tried to export the data from the database running on 
Windows while logged on to the Sun OS, I received the following error:
EXP-00056: ORACLE error 3120 encountered 
ORA-03120: two-task conversion routine: intege 
Export terminated successfully with warnings. 


As I mentioned before, I resolved the issue in trial-and-error fashion without 
understanding what was happening. Maybe you can throw some light on this.

Regards.  

Followup:
If you got an error like that last one -- that would be best for support.

Do you have the "header" of the imp session from Windows to Unix?  I'd like to 
see the character sets in use at that point. 


Rating: 5
Good english... from an Indian. after a long time.  February 04, 2004
Reviewer:  Zaphod B.

 


Rating: 3
  February 05, 2004
Reviewer:  Sanjay Jha  from Toronto

Tom,
I had opened a TAR already. But your response is much faster ;)
Here is the header of the import log:
*****************************************************************
Connected to: Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
With the Partitioning option
JServer Release 8.1.7.0.0 - Production

Export file created by EXPORT:V08.01.07 via conventional path
import done in UTF8 character set and UTF8 NCHAR character set
****************************************************************

Oracle Support is indicating that it had something to do with the buffer size I 
had set up, and the possible solution they are proposing is to reduce that 
number in trial-and-error fashion. Here is what they said:


Solution Explanation
=====================

Specify the BUFFER parameter in the Export/Import command line or in the 
Export/Import parameter file. Reduce the value of BUFFER until the error is 
resolved.

For example:

If BUFFER is set to 10485760 (10 MB), this will cause Export to buffer 10MB of 
data before writing it to the export dump file. This would cause Import to 
buffer 10MB of data before actually sending it to the database (write to 
datafiles). This becomes a problem when doing Export/Import remotely. 10MB of 
data will try to be sent across the network at one time. On the average a 
network packet (buffer) size is 1500 bytes. So trying to send 10MB at one time 
via 1500 byte packets is causing the overflow to occur.


Tom, could you explain what the difference is, in terms of buffer size, between 
an export/import done remotely (via a SQL*Net connection) and one done locally? 
Does it have any bearing on the character set encoding?

Regards,

Sanjay  

Followup:
I'm skeptical of that one (the buffer size). 


