Fixing "cannot access class sun.nio.ch.DirectBuffer" in Apache Spark with Java 17

This article explains how to resolve the IllegalAccessError that occurs when running Apache Spark 3.3.0+ applications with Java 17, providing practical solutions for different development environments.

Problem Description

When running Apache Spark 3.3.0 or later with Java 17, you may encounter the following error:

Caused by: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x59d016c9) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x59d016c9

This occurs because Spark relies on JDK-internal APIs that Java 17's module system no longer exposes by default (strong encapsulation of JDK internals). The sun.nio.ch.DirectBuffer class lives in an internal package of the java.base module, which is not exported to application code (the unnamed module) without explicit permission.
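
The following standalone Java sketch is not Spark source code; it merely mimics the kind of reflective access Spark performs on this internal interface (the class name DirectBufferAccessDemo is purely illustrative). On Java 17 it fails unless the JVM is started with --add-opens=java.base/sun.nio.ch=ALL-UNNAMED:

java
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

// Illustrative only -- mimics access to the JDK-internal sun.nio.ch.DirectBuffer interface.
public class DirectBufferAccessDemo {
    public static void main(String[] args) throws Exception {
        // Direct byte buffers implement the internal interface sun.nio.ch.DirectBuffer.
        ByteBuffer buffer = ByteBuffer.allocateDirect(1024);
        Class<?> directBufferClass = Class.forName("sun.nio.ch.DirectBuffer");

        // Looking up the method succeeds, but making it accessible and invoking it
        // requires java.base to open sun.nio.ch to this (unnamed) module.
        Method address = directBufferClass.getMethod("address");
        address.setAccessible(true); // throws InaccessibleObjectException without --add-opens
        System.out.println("Native address: " + address.invoke(buffer));
    }
}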

Solutions

1. Add Java Module System Options

The primary solution is to add JVM arguments that open the necessary internal packages to Spark. The minimal option needed to resolve this particular error is:

bash
--add-opens=java.base/sun.nio.ch=ALL-UNNAMED

For more comprehensive compatibility, especially in complex applications, use this extended set of options:

bash
--add-opens=java.base/java.lang=ALL-UNNAMED
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED
--add-opens=java.base/java.io=ALL-UNNAMED
--add-opens=java.base/java.net=ALL-UNNAMED
--add-opens=java.base/java.nio=ALL-UNNAMED
--add-opens=java.base/java.util=ALL-UNNAMED
--add-opens=java.base/java.util.concurrent=ALL-UNNAMED
--add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED
--add-opens=java.base/sun.nio.ch=ALL-UNNAMED
--add-opens=java.base/sun.nio.cs=ALL-UNNAMED
--add-opens=java.base/sun.security.action=ALL-UNNAMED
--add-opens=java.base/sun.util.calendar=ALL-UNNAMED
--add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED

2. Environment-Specific Implementations

Command Line Execution

When launching a Spark application from the command line with the java launcher:

bash
java \
    --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
    -jar your-spark-app.jar

IntelliJ IDEA

Add the VM options in your run configuration:

  1. Open Run/Debug Configurations
  2. Select your Spark application configuration
  3. In the VM options field, add:
    --add-opens=java.base/sun.nio.ch=ALL-UNNAMED

Visual Studio Code

Edit your launch.json file:

json
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "java",
      "name": "Main",
      "request": "launch",
      "mainClass": "com.spark.Main",
      "vmArgs": ["--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"]
    }
  ]
}

Maven Projects

For Maven projects, you have several options:

Option 1: Use .mvn/jvm.config
Create a .mvn folder in your project root and add a jvm.config file containing:

--add-opens=java.base/sun.nio.ch=ALL-UNNAMED

Option 2: Configure the Maven Surefire plugin so the option is passed to the forked JVM that runs your tests

xml
<plugin>
   <groupId>org.apache.maven.plugins</groupId>
   <artifactId>maven-surefire-plugin</artifactId>
   <configuration>
      <argLine>--add-opens=java.base/sun.nio.ch=ALL-UNNAMED</argLine>
   </configuration>
</plugin>

Gradle Projects

For Gradle applications using the application plugin (Groovy DSL):

gradle
application {
    applicationDefaultJvmArgs = [
        "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"
    ]
}

For Gradle tests (Kotlin DSL):

kotlin
tasks.test {
    jvmArgs = listOf("--add-opens=java.base/sun.nio.ch=ALL-UNNAMED")
}

Docker Environments

Set the JDK_JAVA_OPTIONS environment variable in your Dockerfile; the java launcher picks it up automatically:

dockerfile
ENV JDK_JAVA_OPTIONS="--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"

SBT Projects

Option 1: Environment variable

bash
export JAVA_OPTS='--add-opens=java.base/sun.nio.ch=ALL-UNNAMED'
sbt run

Option 2: .jvmopts file
Create a .jvmopts file in your project root containing:

--add-opens=java.base/sun.nio.ch=ALL-UNNAMED

3. Alternative Approach: Downgrade to Java 11

If adding JVM arguments is not feasible in your environment (such as certain cloud platforms), consider using Java 11 instead; its default settings still permit access to these JDK-internal APIs, so the error does not occur.

WARNING

While Spark officially supports Java 17, some deployment environments may not allow custom JVM arguments, making Java 11 a practical alternative in these cases.

Version Compatibility Notes

  • Spark 3.3.0-3.3.1: Requires the module system options outlined above
  • Spark 3.3.2+: Includes some compatibility improvements, but the module options are generally still required
  • Spark 3.5.0: Tested with Java 17 and 21 with appropriate module options
  • Java 17: Fully supported with the solutions provided
  • Java 19/21: May work in practice, but is not officially supported by most of the Spark versions above

Best Practices

  1. Start with the minimal option (--add-opens=java.base/sun.nio.ch=ALL-UNNAMED) and only add more if needed
  2. Test thoroughly after applying these changes (a fail-fast startup check, sketched after this list, can help confirm the option is actually in effect)
  3. Consider future compatibility - newer Spark versions may resolve these issues
  4. Document these requirements in your project's setup instructions
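
To support practice 2, a hypothetical fail-fast check like the one below can run at application startup. It is not part of Spark; the class name ModuleOptionsCheck and its messages are illustrative. It uses the standard java.lang.Module API to test whether java.base opens or exports sun.nio.ch to your application's (unnamed) module:

java
// Hypothetical fail-fast check -- not part of Spark or the required setup above.
// Verifies that java.base opens or exports sun.nio.ch to this application's module.
public final class ModuleOptionsCheck {
    public static void main(String[] args) {
        Module javaBase = ModuleLayer.boot().findModule("java.base").orElseThrow();
        Module appModule = ModuleOptionsCheck.class.getModule(); // unnamed module when run from the classpath

        boolean accessible = javaBase.isOpen("sun.nio.ch", appModule)
                || javaBase.isExported("sun.nio.ch", appModule);

        if (!accessible) {
            throw new IllegalStateException(
                    "Missing JVM option: --add-opens=java.base/sun.nio.ch=ALL-UNNAMED");
        }
        System.out.println("sun.nio.ch is accessible; Spark should not hit the IllegalAccessError.");
    }
}

Running this class once with and once without the --add-opens flag is a quick way to confirm that your build tool or IDE is actually passing the option to the JVM.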

Conclusion

The "cannot access class sun.nio.ch.DirectBuffer" error in Spark with Java 17 is a module system compatibility issue that can be resolved by adding appropriate JVM arguments. The solutions provided cover various development and deployment environments, allowing you to continue using modern Java versions with Spark while maintaining application stability.

For the most current information, always check the Apache Spark documentation and Java compatibility notes.