Microsoft Fabric Reference

Reference material for Microsoft Fabric Publishing -- data type mappings, querying, troubleshooting, and CLI arguments.

Data Type Mapping

Source types are mapped to Fabric-compatible Delta Lake types.

PostgreSQL to Fabric

| PostgreSQL Type    | Delta Lake Type |
|--------------------|-----------------|
| INTEGER, INT4      | INT             |
| BIGINT, INT8       | BIGINT          |
| SMALLINT, INT2     | SMALLINT        |
| NUMERIC(p,s)       | DECIMAL(p,s)    |
| REAL, FLOAT4       | FLOAT           |
| DOUBLE PRECISION   | DOUBLE          |
| VARCHAR(n), TEXT   | STRING          |
| DATE               | DATE            |
| TIMESTAMP          | TIMESTAMP       |
| TIMESTAMPTZ        | TIMESTAMP       |
| BOOLEAN            | BOOLEAN         |
| BYTEA              | BINARY          |

SQL Server to Fabric

| SQL Server Type         | Delta Lake Type |
|-------------------------|-----------------|
| INT                     | INT             |
| BIGINT                  | BIGINT          |
| SMALLINT                | SMALLINT        |
| TINYINT                 | TINYINT         |
| DECIMAL(p,s)            | DECIMAL(p,s)    |
| FLOAT                   | DOUBLE          |
| REAL                    | FLOAT           |
| VARCHAR(n), NVARCHAR(n) | STRING          |
| DATE                    | DATE            |
| DATETIME, DATETIME2     | TIMESTAMP       |
| DATETIMEOFFSET          | TIMESTAMP       |
| BIT                     | BOOLEAN         |
| VARBINARY               | BINARY          |
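
The mappings above can be expressed as a simple lookup, which is handy when validating a target schema by hand. The dictionary and helper below are transcribed from the PostgreSQL table as an illustrative sketch; the function name and structure are not part of LakeXpress.

```python
import re

# Transcribed from the PostgreSQL-to-Fabric table above.
PG_TO_DELTA = {
    "INTEGER": "INT", "INT4": "INT",
    "BIGINT": "BIGINT", "INT8": "BIGINT",
    "SMALLINT": "SMALLINT", "INT2": "SMALLINT",
    "REAL": "FLOAT", "FLOAT4": "FLOAT",
    "DOUBLE PRECISION": "DOUBLE",
    "TEXT": "STRING",
    "DATE": "DATE",
    "TIMESTAMP": "TIMESTAMP", "TIMESTAMPTZ": "TIMESTAMP",
    "BOOLEAN": "BOOLEAN",
    "BYTEA": "BINARY",
}

def pg_to_delta(pg_type: str) -> str:
    """Map a PostgreSQL type name to its Delta Lake equivalent.

    Illustrative helper, not LakeXpress API.
    """
    t = pg_type.strip().upper()
    # Parameterized numerics keep precision and scale: NUMERIC(p,s) -> DECIMAL(p,s)
    m = re.fullmatch(r"NUMERIC\((\d+),\s*(\d+)\)", t)
    if m:
        return f"DECIMAL({m.group(1)},{m.group(2)})"
    # Length-parameterized strings collapse to STRING
    if re.fullmatch(r"VARCHAR\(\d+\)", t):
        return "STRING"
    return PG_TO_DELTA[t]

print(pg_to_delta("NUMERIC(12, 2)"))  # DECIMAL(12,2)
print(pg_to_delta("varchar(255)"))    # STRING
```

The same pattern applies to the SQL Server table; only the dictionary entries differ.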

Querying Tables

SQL Analytics Endpoint

-- Query a Delta table
SELECT * FROM your_lakehouse.dbo.customer LIMIT 10;

-- Aggregation
SELECT
    c_nationkey,
    COUNT(*) AS customer_count,
    SUM(c_acctbal) AS total_balance
FROM your_lakehouse.dbo.customer
GROUP BY c_nationkey
ORDER BY customer_count DESC;

Spark Notebooks

# Read a Delta table
df = spark.read.table("customer")
df.show(10)

# Query with SQL
spark.sql("""
SELECT c_nationkey, COUNT(*) as cnt
FROM customer
GROUP BY c_nationkey
""").show()

# Write results back (result_df is any DataFrame you have computed,
# e.g. the result of a spark.sql() call above)
result_df.write.mode("overwrite").saveAsTable("customer_summary")

Troubleshooting

Common Issues

Authentication Errors

Error: AADSTS7000215: Invalid client secret provided

Regenerate your client secret in Azure AD and update credentials.json.

Error: The user or service principal does not have access to the workspace

  1. Verify the Service Principal is added to the workspace
  2. Ensure it has Member or Contributor role
  3. Wait a few minutes for permissions to propagate

OneLake Connection Issues

Error: Unable to connect to OneLake storage

  1. Verify the directory format: onelake://workspace-name/lakehouse-name/
  2. Check that the Service Principal has Storage Blob Data Contributor role
  3. Ensure workspace and lakehouse names are correct (case-sensitive)
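
A quick way to catch the malformed-directory case in step 1 is to validate the onelake:// string before running a sync. The checker below is a sketch based on the format shown above; it is not a LakeXpress utility, and it deliberately preserves case since names are case-sensitive.

```python
from urllib.parse import urlparse

def validate_onelake_dir(uri: str) -> tuple[str, str]:
    """Check that a OneLake directory matches onelake://<workspace>/<lakehouse>/
    and return (workspace, lakehouse). Raises ValueError on malformed input.

    Illustrative helper, not part of LakeXpress.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "onelake":
        raise ValueError(f"expected onelake:// scheme, got {parsed.scheme!r}")
    workspace = parsed.netloc
    # The path should contain exactly one non-empty segment: the lakehouse name.
    segments = [s for s in parsed.path.split("/") if s]
    if not workspace or len(segments) != 1:
        raise ValueError(f"expected onelake://workspace/lakehouse/, got {uri!r}")
    return workspace, segments[0]

print(validate_onelake_dir("onelake://my-workspace/my-lakehouse/"))
```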

Table Creation Failures

Error: Failed to create table in Lakehouse

  1. Verify the SQL analytics endpoint is correct
  2. Check that the Lakehouse is active (not paused)
  3. Ensure sufficient capacity units are available
  4. Confirm Parquet files were exported to OneLake
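
For step 4, it can help to confirm that the export stage actually produced Parquet files before investigating the Lakehouse side. The snippet below scans a local staging directory; the staging layout is an assumption for illustration, since LakeXpress's export location depends on your configuration.

```python
from pathlib import Path
import tempfile

def list_parquet_files(staging_dir: str) -> list[Path]:
    """Return all .parquet files under staging_dir, recursively.

    An empty result suggests the export step failed before publishing.
    Hypothetical helper; adapt the path to your configured staging area.
    """
    return sorted(Path(staging_dir).rglob("*.parquet"))

# Example with a temporary directory standing in for the staging area:
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "customer").mkdir()
    (Path(d) / "customer" / "part-0000.parquet").touch()
    files = list_parquet_files(d)
    print([f.name for f in files])  # ['part-0000.parquet']
```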

SQL Endpoint Connection Issues

Error: Cannot connect to SQL analytics endpoint

  1. Verify the SQL endpoint hostname
  2. Ensure the SQL analytics endpoint is enabled
  3. Check firewall rules if connecting from on-premises

Verifying Export Success

Check files in OneLake:

  1. Open your Lakehouse in Fabric portal
  2. Go to the Files section
  3. Verify Parquet files exist in the expected folders
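
If you prefer checking from code rather than the portal, OneLake also exposes an ABFS-style path for a Lakehouse's Files section (format per Microsoft's OneLake documentation; verify against your tenant). The helper below only builds the URI; actually listing the files would additionally require an Azure storage client and credentials.

```python
def onelake_files_uri(workspace: str, lakehouse: str, subpath: str = "") -> str:
    """Build the ABFS URI for a Lakehouse Files folder in OneLake.

    URI shape follows Microsoft's OneLake docs; workspace and lakehouse
    names are case-sensitive. Illustrative helper, not LakeXpress API.
    """
    base = (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Files"
    )
    return f"{base}/{subpath}".rstrip("/")

print(onelake_files_uri("my-workspace", "my-lakehouse", "customer"))
```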

Check tables:

  1. Open the SQL analytics endpoint
  2. Expand Tables in the object explorer
  3. Verify your tables appear

Debug Mode

./LakeXpress sync --log_lev DEBUG

CLI Reference

Fabric Publishing Arguments

| Option                          | Type    | Description                                       |
|---------------------------------|---------|---------------------------------------------------|
| --publish_target ID             | String  | Credential ID for Fabric publishing (required)    |
| --publish_method METHOD         | Enum    | internal (Delta tables) or external (SQL views)   |
| --publish_table_pattern PATTERN | String  | Table naming pattern (default: {table})           |
| --n_jobs N                      | Integer | Parallel workers for table creation (default: 1)  |
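
Assuming --publish_table_pattern uses straightforward placeholder substitution (an inference from the {table} default and the {schema}_{table} usage in the full example, not documented behavior), the expansion would look like:

```python
def expand_pattern(pattern: str, schema: str, table: str) -> str:
    """Expand a table-naming pattern. Placeholder semantics are assumed,
    not taken from LakeXpress documentation."""
    return pattern.format(schema=schema, table=table)

print(expand_pattern("{table}", "tpch_1", "customer"))           # customer
print(expand_pattern("{schema}_{table}", "tpch_1", "customer"))  # tpch_1_customer
```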

Full Example

./LakeXpress config create \
  -a credentials.json \
  --lxdb_auth_id lxdb_postgres \
  --source_db_auth_id source_postgres \
  --source_db_name tpch \
  --source_schema_name tpch_1 \
  --fastbcp_dir_path ./FastBCP_linux-x64/latest/ \
  --fastbcp_p 2 \
  --n_jobs 4 \
  --target_storage_id onelake_storage \
  --publish_target fabric_lakehouse \
  --publish_method internal \
  --publish_table_pattern "{schema}_{table}" \
  --generate_metadata

./LakeXpress sync

Copyright © 2026 Architecture & Performance.