[oracle] ORA-01461: can bind a LONG value only for insert into a LONG column - occurs when querying

When I try to query objects, I end up with the following error:

ORA-01461: can bind a LONG value only for insert into a LONG column

Could someone please help me with the cause and solution of this problem?

This question is related to: oracle, oracle11g

The answer is


I was facing the same issue and solved it by just replacing VARCHAR with CLOB. This link helped me out.
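If the column itself is (or becomes) a CLOB, here is a minimal JDBC sketch of binding a long string to it; the table MY_TABLE, its columns, and the connection details are hypothetical placeholders, not taken from the original answer:

import java.io.StringReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ClobInsertSketch {
    public static void main(String[] args) throws Exception {
        // A value well above the 4000-byte VARCHAR2 limit
        String longText = new String(new char[8000]).replace('\0', 'x');

        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//host:1521/service", "user", "pwd");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO MY_TABLE (ID, LONG_TEXT) VALUES (?, ?)")) {
            ps.setLong(1, 1L);
            // Bind the value as a character stream so the driver sends it as
            // CLOB data instead of a VARCHAR bind limited to 4000 bytes.
            ps.setCharacterStream(2, new StringReader(longText), longText.length());
            ps.executeUpdate();
        }
    }
}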


I have a solution for Java/JPA/EclipseLink/Oracle when inserting a long XML string (>4000 characters) into an XMLTYPE column, posted at Insert XML with more than 4000 characters into an Oracle XMLTYPE column. For clarity, I include the same contents here in case the link stops working.

You need to convert an XML string of more than 4000 characters into the SQLXML type first.

Environment: JPA 2.1.0, EclipseLink 2.5.2, Oracle DB 11gR2

SQL:

CREATE TABLE "XMLTEST"
( "ID" NUMBER(10,0) NOT NULL ENABLE, 
  "DESCRIPTION" VARCHAR2(50 CHAR) NOT NULL ENABLE, 
  "XML_TXT" "XMLTYPE" NOT NULL ENABLE
);

INSERT INTO XMLTEST (ID, DESCRIPTION, XML_TXT) VALUES (101, 'XML DATA', '<data>TEST</data>');
COMMIT;

DROP TABLE "XMLTEST";

Java code - convert the XML string into SQLXML

String sql = "INSERT INTO XMLTEST (ID, DESCRIPTION, XML_TXT) VALUES (?, ?, ?)";
String xmlDataStr = "<data>test...</data>"; // a long xml string with length > 4000 characters
Connection con = getEntityManager().unwrap(Connection.class);
SQLXML sqlXml = con.createSQLXML();
sqlXml.setString(xmlDataStr);

Java code - use PreparedStatement

PreparedStatement pstmt = con.prepareStatement(sql);
pstmt.setLong(1, 201);
pstmt.setString(2, "Long XML Data");
pstmt.setSQLXML(3, sqlXml);
pstmt.execute();

Java code - use native query instead of PreparedStatement

Query query = getEntityManager().createNativeQuery(sql);
query.setParameter(1, 301);
query.setParameter(2, "Long XML Data");
query.setParameter(3, sqlXml);
query.executeUpdate();

Kiran's answer is definitely the answer for my case.

In the code, I split the string into 4000-character chunks and tried to put them into the DB.

It exploded with this error.

The cause of the error is using UTF characters, which count as two bytes each. Even if I truncate to 4000 characters in code (something like String.Take(4000)), Oracle counts it as 4001 when the string contains 'ö' or any other non-English character (non-ASCII, to be precise, which are represented with two or more bytes in UTF-8).
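As a minimal sketch of a workaround (in Java rather than the original .NET code, and assuming the 4000-byte limit and a UTF-8 database character set), you can truncate by encoded byte length instead of character count:

import java.nio.charset.StandardCharsets;

public class ByteLimitSketch {
    // Truncate a string so that its UTF-8 encoding fits within maxBytes,
    // cutting at a character boundary rather than in the middle of a
    // multi-byte sequence.
    static String truncateToBytes(String s, int maxBytes) {
        if (s.getBytes(StandardCharsets.UTF_8).length <= maxBytes) {
            return s;
        }
        int end = s.length();
        while (end > 0 && s.substring(0, end).getBytes(StandardCharsets.UTF_8).length > maxBytes) {
            end--;
        }
        return s.substring(0, end);
    }

    public static void main(String[] args) {
        String text = "ö".repeat(3000); // 3000 characters, but 6000 bytes in UTF-8
        System.out.println(text.length());                                // 3000
        System.out.println(text.getBytes(StandardCharsets.UTF_8).length); // 6000
        String safe = truncateToBytes(text, 4000);
        System.out.println(safe.getBytes(StandardCharsets.UTF_8).length); // 4000
    }
}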


I had the same problem using PHP and prepared statements on a VARCHAR2 column. My string didn't exceed the VARCHAR2 size. The problem was that I used -1 as the maximum length for binding, but the variable content changed later.

For example:

$sMyVariable = '';
$rParsedQuery = oci_parse($rLink, 'INSERT INTO MyTable (MyVarChar2Column) VALUES (:MYPLACEHOLDER)');
oci_bind_by_name($rParsedQuery, ':MYPLACEHOLDER', $sMyVariable, -1, SQLT_CHR);

$sMyVariable = 'a';
oci_execute($rParsedQuery, OCI_DEFAULT);
$sMyVariable = 'b';
oci_execute($rParsedQuery, OCI_DEFAULT);

If you replace the -1 with the maximum column width (e.g. 254), then this code works. With -1, oci_bind_by_name uses the current length of the variable content (in my case 0) as the maximum length for this bind. This results in ORA-01461 when executing.


I had the same problem with Entity Framework database-first on all CLOB columns.

As a workaround, I padded the text values with spaces to at least 4000 characters in insert operations (I did not come up with any better solution).


This error occurs when one attempts to use a VARCHAR2 variable longer than 4000 bytes in a SQL statement. PL/SQL allows VARCHAR2 values up to 32767 bytes, but the limit for database tables and the SQL language is 4000. You can't use PL/SQL variables that SQL doesn't recognize in SQL statements; an exception, as the message explains, is a direct insert into a LONG-type column.

create table test (v varchar2(10), c clob);


declare
  shortStr varchar2(10) := '0123456789';
  longStr1 varchar2(10000) := shortStr;
  longStr2 varchar2(10000);
begin
  for i in 1 .. 10000
  loop
    longStr2 := longStr2 || 'X';
  end loop;

  -- The following results in ORA-01461
  insert into test(v, c) values(longStr2, longStr2);

  -- This is OK; the actual length matters, not the declared one
  insert into test(v, c) values(longStr1, longStr1);

  -- This works, too (a direct insert into a clob column)
  insert into test(v, c) values(shortStr, longStr2);

  -- ORA-01461 again: You can't use longStr2 in an SQL function!
  insert into test(v, c) values(shortStr, substr(longStr2, 1, 4000));
end;
/

A colleague of mine and I found out the following:

When we use the Microsoft .NET Oracle driver to connect to an Oracle database (System.Data.OracleClient.OracleConnection) and try to insert a string with a length between 2000 and 4000 characters into a CLOB or NCLOB field using a database parameter:

oraCommand.CommandText = "INSERT INTO MY_TABLE (NCLOB_COLUMN) VALUES (:PARAMETER1)";
// Add string-parameters with different lengths
// oraCommand.Parameters.Add("PARAMETER1", new string(' ', 1900)); // ok
oraCommand.Parameters.Add("PARAMETER1", new string(' ', 2500));  // Exception
//oraCommand.Parameters.Add("PARAMETER1", new string(' ', 4100)); // ok
oraCommand.ExecuteNonQuery();
  • any string with a length under 2000 characters will not throw this exception
  • any string with a length of more than 4000 characters will not throw this exception
  • only strings with a length between 2000 and 4000 characters will throw this exception

We opened a ticket at Microsoft for this bug many years ago, but it has still not been fixed.


I encountered this error message when trying to insert a String into an XMLTYPE column.

Specifically using Java's PreparedStatement like this:

ps.setString("XML", document);

where the XML column here is defined as XMLTYPE. Binding the document with setString is what triggers the error; converting the string to an SQLXML value first (as in the EclipseLink answer above) avoids it.


In my particular case, I was trying to store a Base64-encoded file into a table BLOB field, using MyBatis.

So in my XML I had:

<insert id="save..." parameterType="...DTO">
    <selectKey keyProperty="id" resultType="long" order="BEFORE">
        SELECT SEQ.nextVal FROM DUAL
    </selectKey>
    insert into MYTABLE(
        ID,
        ...,
        PDF
    ) values (
        #{id, jdbcType=VARCHAR},
        ...,
        #{tcPdf, jdbcType=BLOB}
    )
</insert>

and in my DTO:

String getPdf(){
    return pdf;
}

That makes MyBatis treat it as if it were a String character sequence and try to store it as a VARCHAR. So my solution was the following:

In my DTO:

byte[] getPdf(){
    return pdf.getBytes();
}

And it worked.

I hope this helps somebody.


I encountered the same problem using Siebel REXPIMP (repository import) when using the latest Instant Client driver. To fix the issue, use the Siebel-provided DataDirect driver instead. The DLL is SEOR823.DLL.


It can also happen with VARCHAR2 columns. This is easy to reproduce with PreparedStatements through JDBC (a minimal sketch follows the list) by simply

  1. creating a table with a VARCHAR2(20) column (or any other arbitrary length) and
  2. inserting a row into the above table with more than 20 characters in that column
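A minimal JDBC sketch of that reproduction (the connection details and the scratch table T_SHORT are hypothetical, and newer drivers may report ORA-12899 instead):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class Ora01461Repro {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//host:1521/service", "user", "pwd")) {
            // step 1: a table with a short VARCHAR2 column
            try (Statement st = con.createStatement()) {
                st.execute("CREATE TABLE T_SHORT (TXT VARCHAR2(20))");
            }
            // step 2: bind a value far longer than the column allows
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO T_SHORT (TXT) VALUES (?)")) {
                ps.setString(1, "a string that is clearly longer than twenty characters");
                ps.executeUpdate();
            }
        }
    }
}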

So, as said above, it can be a mismatch of types or an exceeded column width.

Also note that since VARCHAR2 allows at most 4000 bytes, the real limit will be about 2000 characters for double-byte characters.

Hope this helps


Applications using the JDBC 10.1 driver are affected by a bug (Doc ID 370438.1) and can throw the same ORA-01461 exception while working with a UTF8 character set database, even though the inserted characters are fewer than the maximum size of the column.

Recommended solution: use 10gR2 JDBC drivers or higher in such cases.

HTH


Adding another use case where I found this happening: I was using an ADF Fusion application, and the column type being used was VARCHAR2(4000), which could not accommodate the text, hence this error.


This ORA-01461 does not occur only while inserting into a LONG column. The error can also occur when binding a long string for insert into a VARCHAR2 column, and it most commonly occurs when there is a multi-byte character conversion issue (a single character can take more than one byte of space in Oracle).

If the database character set is UTF-8 then, because each character can take up to 3 bytes, a conversion factor of 3 is applied to the length check, so you are effectively limited to 1333 characters when inserting into a VARCHAR2(4000).

Another solution would be to change the datatype from VARCHAR2(4000) to CLOB.
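As a quick sanity check of that 1333-character figure (my own illustration, not part of the original answer), a character such as '€' encodes to 3 bytes in UTF-8:

import java.nio.charset.StandardCharsets;

public class Utf8LimitCheck {
    public static void main(String[] args) {
        // 1333 three-byte characters fit in 4000 bytes (3999), 1334 do not (4002).
        String fits = "€".repeat(1333);
        String tooLong = "€".repeat(1334);
        System.out.println(fits.getBytes(StandardCharsets.UTF_8).length);    // 3999
        System.out.println(tooLong.getBytes(StandardCharsets.UTF_8).length); // 4002
    }
}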