Complete Type Mapping Table
| Oracle Type | HANA Equivalent | Size / Precision | Notes |
|---|---|---|---|
| VARCHAR2(n) | NVARCHAR(n) | n characters (Unicode) | HANA is always Unicode. VARCHAR2 size in Oracle is bytes; NVARCHAR length in HANA is characters. A Euro symbol is 1 character in NVARCHAR but multiple bytes in VARCHAR2. |
| CHAR(n) | NCHAR(n) | Fixed-length | Rarely used in modern systems. Prefer variable-length. |
| NUMBER(p,s) | DECIMAL(p,s) | Precision p, scale s | Direct equivalent. DECIMAL is fixed-point (exact, suitable for money). |
| NUMBER (no p,s) | DECIMAL(38,10) or BIGINT | See notes | If used for integers, use BIGINT. If used for decimals, choose the DECIMAL size based on a data audit. |
| INTEGER | INTEGER | -2^31 to 2^31-1 | Not quite identical: Oracle's INTEGER is an alias for NUMBER(38); HANA's INTEGER is a true 32-bit type. Verify values fit the range. |
| NUMBER(19,0) | BIGINT | -2^63 to 2^63-1 | Oracle has no native BIGINT; NUMBER(19,0) is the usual stand-in. Use BIGINT for large integers. |
| FLOAT | REAL or DOUBLE | Floating-point | Avoid for money. Use DECIMAL instead. |
| DATE | DATE or TIMESTAMP | See gotcha below | CRITICAL: Oracle DATE has a time component; HANA DATE does not. |
| TIMESTAMP | TIMESTAMP | Date and time | Identical concept. Be aware of timezone handling. |
| CLOB | NCLOB | Large text | Oracle CLOB can be byte- or character-based; HANA NCLOB is always character (Unicode). |
| BLOB | BLOB | Large binary | Identical. |
| BOOLEAN (PL/SQL only) | TINYINT (commonly) | 0 or 1 | Pre-23ai Oracle columns can't be BOOLEAN. In HANA, TINYINT, CHAR(1), or INTEGER are common workarounds; newer HANA releases do offer a BOOLEAN type. |
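Because VARCHAR2 lengths are (by default) bytes while NVARCHAR lengths are characters, it's worth checking whether any multi-byte data approaches a column's declared limit before reusing the same n. A sketch against a hypothetical status column (LENGTHB returns bytes, LENGTH returns characters in Oracle):

```sql
-- Oracle: compare byte length vs character length.
-- Rows where they differ contain multi-byte characters.
SELECT MAX(LENGTHB(status)) AS max_bytes,
       MAX(LENGTH(status))  AS max_chars
FROM cs_commissions;
-- If max_chars fits the declared NVARCHAR(n), the same n is safe in HANA.
```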
The DATE Gotcha: Oracle vs HANA
This is the single most important type difference. It causes silent data loss if not handled correctly.
What's the Difference?
Oracle DATE: Stores both date and time (hour, minute, second). Resolution: 1 second.
HANA DATE: Stores only date (year, month, day). Resolution: 1 day. No time component.
Real-World Example
Callidus commission_date column: you store a transaction timestamp like "2026-04-08 14:30:45" (April 8, 2:30:45 PM).
```sql
-- Oracle: insert a timestamped row
INSERT INTO cs_commissions
VALUES (1, TO_DATE('2026-04-08 14:30:45', 'YYYY-MM-DD HH24:MI:SS'));

-- Query returns the full value
SELECT commission_date FROM cs_commissions;
-- Result: 2026-04-08 14:30:45
```
When you migrate this to HANA and use DATE type:
```sql
-- HANA: insert (the time component is silently dropped)
INSERT INTO cs_commissions
VALUES (1, TO_DATE('2026-04-08 14:30:45', 'YYYY-MM-DD HH24:MI:SS'));

SELECT commission_date FROM cs_commissions;
-- Result: 2026-04-08 (no time!)
```
The time (14:30:45) is gone. If you only aggregate by date, this may not matter. But if you calculate time differences or filter by specific times, you've lost data.
The Solution
Audit your data first. For each column that's DATE in Oracle, ask: does it store time information that matters?
```sql
-- Check if commission_date has non-zero time components
SELECT COUNT(*)
FROM cs_commissions
WHERE TRUNC(commission_date) != commission_date;
-- If count > 0, the column has time data that you'll lose on migration
```
If the result is > 0, map that column to TIMESTAMP in HANA instead of DATE:
```sql
INSERT INTO cs_commissions
VALUES (1, TO_TIMESTAMP('2026-04-08 14:30:45', 'YYYY-MM-DD HH24:MI:SS'));

SELECT commission_date FROM cs_commissions;
-- Result: 2026-04-08 14:30:45 (time preserved)
```
NUMBER Precision and Scale Issues
Oracle's NUMBER(p,s) is flexible — you can omit s (scale) and it becomes a variable-precision decimal. HANA's DECIMAL requires both p (precision) and s (scale).
```sql
CREATE TABLE payroll (
    total_comp  NUMBER(12, 2),
    employee_id NUMBER,      -- What about this one? Maximum precision, no scale specified
    tax_rate    NUMBER(5)
);
```
In Oracle:
- NUMBER(12,2) = max 12 digits, 2 after the decimal. Maximum value: 9,999,999,999.99.
- NUMBER = effectively (38,127): huge, variable precision.
- NUMBER(5) = 5 digits, scale 0: integers from -99999 to 99999.
In HANA, you must specify both:
```sql
CREATE TABLE payroll (
    total_comp  DECIMAL(12, 2),
    employee_id BIGINT,         -- Large integer, not DECIMAL
    tax_rate    DECIMAL(5, 3)   -- 5 digits, 3 after decimal (0.000-99.999)
);
```
Before migrating, audit the actual range:

```sql
SELECT MAX(column_name), MIN(column_name)
FROM your_table
WHERE column_name IS NOT NULL;
```

Then choose DECIMAL(p,s) or BIGINT based on the data, not just the Oracle definition.

Character Encoding: NLS_CHARACTERSET vs Unicode
Oracle's Character Set
Oracle respects the database NLS_CHARACTERSET parameter. Common values:
- US7ASCII: 7-bit ASCII (no accents, no international characters).
- WE8MSWIN1252: Windows Western European (covers most Latin-based languages).
- UTF8: Unicode (full international support).
- AL32UTF8: AL ("all languages") 32-bit UTF-8 (full Unicode).
Your Callidus instance probably uses UTF8 or AL32UTF8 (modern default). But if it's older, it might be WE8MSWIN1252 or US7ASCII.
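You can confirm the character set directly; NLS_DATABASE_PARAMETERS is a standard Oracle data dictionary view:

```sql
SELECT value
FROM nls_database_parameters
WHERE parameter = 'NLS_CHARACTERSET';
```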
HANA's Character Encoding
HANA is always Unicode (CESU-8 variant). All string types (NVARCHAR, NCHAR) are Unicode-native.
Implications
If your Callidus database is US7ASCII, special characters (é, ñ, ü, Chinese, Arabic, etc.) cannot be stored, so your data doesn't contain them. Migration is straightforward.
If your Callidus database is UTF8 or AL32UTF8, you have international characters. HANA's Unicode encoding is compatible, so migration is usually fine. But test with real data (names, descriptions, addresses with special characters).
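One way to find candidate test rows in Oracle: ASCIISTR escapes non-ASCII characters, so any row where it changes the value contains them. A sketch, assuming a hypothetical participant_name column:

```sql
-- Count rows containing non-ASCII characters
SELECT COUNT(*)
FROM cs_participants
WHERE participant_name != ASCIISTR(participant_name);
```

Spot-check a sample of those rows after migration to confirm the characters survived intact.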
Implicit Type Conversions
Oracle is forgiving with type conversions. HANA is stricter. This can cause migration queries to fail.
String to Number
```sql
-- Oracle: implicit conversion works
SELECT '100' + 50 FROM dual;  -- Converts '100' to 100, returns 150

-- HANA: convert explicitly
SELECT CAST('100' AS INTEGER) + 50 FROM dummy;
```
String to Date
```sql
-- Oracle: string converted implicitly using NLS_DATE_FORMAT
SELECT * FROM cs_transactions WHERE created_date = '2026-04-08';

-- Explicit, portable version
SELECT * FROM cs_transactions
WHERE created_date = TO_DATE('2026-04-08', 'YYYY-MM-DD');
```
BOOLEAN: No Native Support
Older HANA releases lack a native BOOLEAN column type, and client/driver support still varies even where it exists. Common workarounds:
| Workaround | Pros | Cons |
|---|---|---|
| TINYINT (0/1) | Storage-efficient. Numeric comparisons work. | Not semantically clear (is 0 false or a numeric zero?). Allows invalid values like 2. |
| CHAR(1) ('Y'/'N') | Semantically clear. Human-readable. | Slower comparisons (string vs numeric). |
| INTEGER (0/1) | Clear. Standard 32-bit integer. | Wastes space compared to TINYINT. |
Recommendation: use TINYINT with CHECK constraint:
```sql
CREATE TABLE cs_participants (
    id        INTEGER,
    is_active TINYINT CHECK (is_active IN (0, 1))
);
```
Data Migration Script Pattern
When migrating schema, use this pattern to handle type conversions explicitly:
```sql
-- Step 1: Create staging table with HANA types
CREATE TABLE cs_commissions_staging (
    id                INTEGER,
    salesperson_id    INTEGER,
    commission_amount DECIMAL(10,2),
    created_date      TIMESTAMP,    -- Not DATE (preserves time)
    status            NVARCHAR(20), -- Not VARCHAR2
    is_approved       TINYINT       -- Not BOOLEAN
);

-- Step 2: Insert from Oracle with explicit conversions
INSERT INTO cs_commissions_staging
SELECT
    id,
    salesperson_id,
    CAST(commission_amount AS DECIMAL(10,2)),
    TO_TIMESTAMP(created_date, 'YYYY-MM-DD HH24:MI:SS'),
    CAST(status AS NVARCHAR(20)),
    CASE WHEN is_approved = 'Y' THEN 1 ELSE 0 END
FROM oracle_link.cs_commissions;
```
Migration Checklist: Data Types
- Audit all DATE columns: For each DATE in Oracle, run the "TRUNC != column" query to detect time components.
- Decide DATE vs TIMESTAMP: If time data exists and matters, use TIMESTAMP in HANA.
- Audit NUMBER columns: Understand max/min values to choose DECIMAL(p,s) or BIGINT.
- Map VARCHAR2 to NVARCHAR: Specify the same length, but understand it's now character-based (not byte-based).
- Handle BOOLEAN: Choose TINYINT, CHAR(1), or INTEGER. Add CHECK constraints.
- Validate character encoding: Test migration with data containing non-ASCII characters (if applicable).
- Write explicit conversion queries: Don't rely on implicit conversions. Use CAST, TO_DATE, TO_TIMESTAMP.
- Test data migration on copy: Migrate a production copy to HANA test environment. Validate row counts, sample values, and calculated fields match Oracle.
Post-Migration Validation
After migrating data, run these queries to validate:
```sql
-- Oracle
SELECT COUNT(*) FROM cs_commissions;  -- e.g., 5000000

-- HANA
SELECT COUNT(*) FROM cs_commissions;  -- Must be 5000000
```
```sql
-- Oracle
SELECT COUNT(*) FROM cs_commissions
WHERE created_date BETWEEN TO_DATE('2026-01-01', 'YYYY-MM-DD')
                       AND TO_DATE('2026-01-31', 'YYYY-MM-DD');

-- HANA — should match
SELECT COUNT(*) FROM cs_commissions
WHERE CAST(created_date AS DATE) BETWEEN TO_DATE('2026-01-01', 'YYYY-MM-DD')
                                     AND TO_DATE('2026-01-31', 'YYYY-MM-DD');
```
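Row and range counts alone won't catch truncated or rounded amounts; comparing aggregates on key numeric columns is a cheap extra cross-check. A sketch using the commission_amount column from the staging example:

```sql
-- Run on both Oracle and HANA; the results must match exactly
SELECT COUNT(*)               AS row_count,
       SUM(commission_amount) AS total_amount,
       MIN(commission_amount) AS min_amount,
       MAX(commission_amount) AS max_amount
FROM cs_commissions;
```

Any mismatch in SUM, MIN, or MAX points at a precision or conversion problem even when the row counts agree.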