Reading a File
About
Reading files is a common requirement in many applications—whether it’s configuration, data imports, templates, or logs. In Spring Boot, how we read a file depends on where the file is located and how we want to read it.
Read a File Based on Its Location
1. Reading a File from the resources Folder (Classpath)
Files placed in the src/main/resources folder of a Spring Boot application are automatically included in the application's classpath at build time. These files are bundled inside the final JAR or WAR and are accessible using classpath-based resource loading techniques. This is suitable for configuration files, templates, static data, or test inputs that are part of the application package.
Accessing these files requires using ClassPathResource or Spring's ResourceLoader. We cannot use File directly on these resources after packaging, as they no longer exist on the file system.
Example
Using ClassPathResource:
import org.springframework.core.io.ClassPathResource;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public String readFileFromClasspath() throws IOException {
    ClassPathResource resource = new ClassPathResource("data/sample.txt");
    try (InputStream inputStream = resource.getInputStream()) {
        return new String(inputStream.readAllBytes(), StandardCharsets.UTF_8);
    }
}
Using ResourceLoader (if dependency injection is available):
@Autowired
private ResourceLoader resourceLoader;

public String readUsingResourceLoader() throws IOException {
    Resource resource = resourceLoader.getResource("classpath:data/sample.txt");
    try (InputStream inputStream = resource.getInputStream()) {
        return new String(inputStream.readAllBytes(), StandardCharsets.UTF_8);
    }
}
2. Reading a File from File System (Absolute or Relative Path)
Files located outside the application (for example, in a shared folder or logs directory) can be read directly using standard Java IO or NIO APIs. This method is used when the file is dynamic, user-uploaded, or generated at runtime.
An absolute path points to an exact location in the file system (e.g., /opt/app/data.txt), while a relative path is resolved relative to the directory from which the application was started.
This is ideal for processing large files, data imports, runtime logs, or reports generated by other systems.
Example
Reading using java.nio.file.Files:
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public String readFileFromAbsolutePath() throws IOException {
    Path path = Path.of("/opt/app/data/sample.txt");
    return Files.readString(path, StandardCharsets.UTF_8);
}
Reading with traditional IO:
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public String readFileOldWay(String path) throws IOException {
    try (FileInputStream fis = new FileInputStream(path)) {
        return new String(fis.readAllBytes(), StandardCharsets.UTF_8);
    }
}
3. Reading a File from a System Property or Environment Variable
In Spring Boot, external configuration (like file paths) is often provided using environment variables or system properties. This allows the file path to be dynamic, configurable per environment (dev/test/prod), and not hardcoded in the source code.
We can inject such configuration using the @Value annotation or access it via Environment or System.getProperty().
Example
Using @Value:
@Value("${app.input-file}")
private String filePath;
public String readFromConfigPath() throws IOException {
return Files.readString(Path.of(filePath), StandardCharsets.UTF_8);
}
In application.properties:
app.input-file=/opt/app/data/sample.txt
Or using Java system property:
String path = System.getProperty("my.input.file");
String content = Files.readString(Path.of(path), StandardCharsets.UTF_8);
Command line:
java -Dmy.input.file=/opt/app/data/sample.txt -jar app.jar
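Spring's Environment can also be used to resolve the same key; a minimal sketch, assuming the app.input-file property shown above:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.env.Environment;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

@Autowired
private Environment environment;

public String readUsingEnvironment() throws IOException {
    // Environment resolves the key from property files, environment variables, or -D system properties.
    String configuredPath = environment.getProperty("app.input-file");
    return Files.readString(Path.of(configuredPath), StandardCharsets.UTF_8);
}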
4. Reading a File from a URL or Remote Source
Sometimes, files are not stored locally but are hosted on external systems over HTTP, HTTPS, or FTP. In such cases, the file can be read using Java’s URL and stream APIs.
This is useful for reading content from a public endpoint, content server, CDN, cloud storage, or even internal microservices that expose files as endpoints.
Always handle network-related exceptions and add timeouts to avoid application hangs due to unreachable sources.
Example
Using java.net.URL and BufferedReader:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.stream.Collectors;

public String readFileFromUrl() throws IOException {
    URL url = new URL("https://example.com/data/sample.txt");
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
        return reader.lines().collect(Collectors.joining("\n"));
    }
}
Add timeouts using HttpURLConnection if needed.
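A minimal sketch of adding connect and read timeouts with HttpURLConnection (the URL and timeout values are illustrative):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.stream.Collectors;

public String readFileFromUrlWithTimeouts() throws IOException {
    URL url = new URL("https://example.com/data/sample.txt");
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    connection.setConnectTimeout(5_000); // fail fast if the server cannot be reached
    connection.setReadTimeout(10_000);   // fail if the server stops sending data
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(connection.getInputStream(), StandardCharsets.UTF_8))) {
        return reader.lines().collect(Collectors.joining("\n"));
    } finally {
        connection.disconnect();
    }
}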
5. Reading a File Uploaded via API (MultipartFile)
When users upload files via HTTP POST requests, Spring Boot maps the uploaded file to a MultipartFile object. This object provides methods to access the file's content, metadata (name, type), and streams.
This is commonly used in REST APIs where users upload CSVs, documents, or images that need to be processed, stored, or validated.
The file is stored temporarily in memory or disk by Spring, depending on its size and configuration.
Example
REST controller method to receive and read the uploaded file:
@PostMapping("/upload")
public ResponseEntity<String> handleFileUpload(@RequestParam("file") MultipartFile file) throws IOException {
String content = new String(file.getBytes(), StandardCharsets.UTF_8);
return ResponseEntity.ok(content);
}
We can also use:
file.getInputStream() – for streaming large files
file.getOriginalFilename() – for logging or renaming
This works with HTML forms or multipart REST clients like Postman.
6. Reading a File from a Remote SFTP Location
SFTP (SSH File Transfer Protocol) is a secure way to access files on a remote server. To read a file from an SFTP location in a Spring Boot application, we typically use Java libraries that support SFTP, such as:
JSch (Java Secure Channel) – lightweight and commonly used
Apache Commons VFS
Spring Integration SFTP – part of Spring Integration for more advanced or reactive SFTP processing
The file is accessed over a secure SSH session, and we need the server address, port, username, password (or private key), and file path.
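Below is a minimal sketch using JSch (com.jcraft:jsch); the host, credentials, and remote path are hypothetical placeholders, and production code should verify host keys rather than disable the check:
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public String readFileOverSftp() throws Exception { // JSchException, SftpException, IOException
    JSch jsch = new JSch();
    Session session = jsch.getSession("user", "sftp.example.com", 22); // placeholder host and user
    session.setPassword("secret"); // or configure a private key via jsch.addIdentity(...)
    session.setConfig("StrictHostKeyChecking", "no"); // sketch only; verify host keys in production
    session.connect(10_000);
    try {
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect(10_000);
        try (InputStream inputStream = sftp.get("/remote/data/sample.txt")) {
            return new String(inputStream.readAllBytes(), StandardCharsets.UTF_8);
        } finally {
            sftp.disconnect();
        }
    } finally {
        session.disconnect();
    }
}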
Read a File Based on Reading Technique
Reading a file can be done in multiple ways depending on how much content we want to read at once, how we plan to process it, and whether we expect text or binary content.
1. Reading Whole File into a String
This is the simplest and most direct method to read an entire file into memory as a single string. It is suitable for small to medium-sized files where the content needs to be processed as a whole (e.g., templates, config files, JSON, XML, SQL scripts).
This approach is not recommended for very large files, as it loads the entire file into RAM.
Example
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileReaderExample {

    public String readWholeFile(String filePath) throws IOException {
        return Files.readString(Path.of(filePath), StandardCharsets.UTF_8);
    }
}
2. Reading Line by Line using BufferedReader
BufferedReader is used to read a file line by line efficiently. It buffers the input from the file to avoid frequent disk access, making it memory-efficient and suitable for large files such as logs or data imports.
It provides better control over processing each line separately, which is useful for transformation, filtering, or validation tasks.
Example
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class BufferedReaderExample {

    public void readLineByLine(String filePath) throws IOException {
        try (BufferedReader reader = new BufferedReader(new FileReader(filePath))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println("Line: " + line);
            }
        }
    }
}
3. Reading Using Java 8 Streams
Java 8 introduced the Files.lines() method, which returns a Stream<String>. This allows us to use functional programming techniques like map, filter, collect, etc. It is ideal for pipeline-based data processing and allows efficient, lazy evaluation of large files.
This is a good choice when we need line-by-line logic combined with concise functional transformations.
Example
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class StreamReaderExample {

    public void processLines(String filePath) throws IOException {
        try (Stream<String> lines = Files.lines(Path.of(filePath))) {
            lines.filter(line -> !line.isBlank())
                 .map(String::toUpperCase)
                 .forEach(System.out::println);
        }
    }
}
4. Reading Using Scanner
Scanner is a utility class used for parsing text using regular expressions. It is often used to read text token by token: by default, it reads words separated by whitespace, but it can be configured with custom delimiters.
Scanner is suitable for parsing structured content like CSV, whitespace-delimited text, or small data files. It is not as efficient as BufferedReader for large files.
Example
import java.io.File;
import java.io.IOException;
import java.util.Scanner;

public class ScannerExample {

    public void readUsingScanner(String filePath) throws IOException {
        try (Scanner scanner = new Scanner(new File(filePath))) {
            while (scanner.hasNext()) {
                String word = scanner.next();
                System.out.println("Token: " + word);
            }
        }
    }
}
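For the custom-delimiter case mentioned above, a minimal sketch that splits comma-separated tokens (the delimiter pattern is illustrative):
import java.io.File;
import java.io.IOException;
import java.util.Scanner;

public class ScannerDelimiterExample {

    public void readCommaSeparated(String filePath) throws IOException {
        try (Scanner scanner = new Scanner(new File(filePath))) {
            scanner.useDelimiter(",|\\R"); // split on commas or line breaks
            while (scanner.hasNext()) {
                System.out.println("Field: " + scanner.next().trim());
            }
        }
    }
}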
5. Reading Using Spring’s Resource API
Spring provides an abstraction for loading files from various locations (classpath, file system, URL, etc.) using the Resource interface. This approach is especially useful for reading files bundled in our resources/ folder, or when we want to use @Value injection for configuration files.
This is a Spring-centric and flexible approach that decouples file access from the source (location).
Example
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.Resource;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class SpringResourceExample {

    public String readFromClasspath() throws IOException {
        Resource resource = new ClassPathResource("data/sample.txt");
        try (InputStream inputStream = resource.getInputStream()) {
            return new String(inputStream.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
For property-based injection:
@Value("classpath:data/sample.txt")
private Resource resource;
@PostConstruct
public void init() throws IOException {
String content = new String(resource.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
System.out.println(content);
}
6. Reading a File into Byte Array
When working with binary files such as images, PDFs, Excel, or audio files, we need to read the content into a byte[]. This method allows us to preserve the exact binary representation of the file without any encoding assumptions.
It can also be used when we want to store files in a database or return them directly in a REST API response (e.g., for downloads).
Example
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ByteArrayExample {

    public byte[] readBytes(String filePath) throws IOException {
        return Files.readAllBytes(Path.of(filePath));
    }
}
For classpath resources:
Resource resource = new ClassPathResource("images/logo.png");
byte[] content;
try (InputStream inputStream = resource.getInputStream()) {
    content = inputStream.readAllBytes();
}
Best Practices based on File Size
1. Small Files (up to a few KB)
Characteristics
Usually configuration, templates, metadata, or static data
Can be safely loaded fully into memory
Best Practices
Use Files.readString(Path) or InputStream.readAllBytes() to load the content directly.
Prefer ClassPathResource or ResourceLoader if the file is inside the resources folder.
Use @Value("classpath:...") injection for read-only static files.
Example
String content = Files.readString(Path.of("data/info.txt"));
Recommendation
Keep the file in src/main/resources/ and read it once (e.g., at startup or via @PostConstruct) if it's used repeatedly.
2. Medium Files (few KB to several MB)
Characteristics
Contains larger datasets like CSV, JSON logs, or XML files
Might need conditional processing, transformation, or parsing
Best Practices
Avoid reading the whole file at once unless required.
Prefer BufferedReader for line-by-line reading.
Use Files.lines(Path) for stream-based processing with filter, map, limit, etc.
Validate and sanitize each line before processing.
Example
try (Stream<String> lines = Files.lines(Path.of("records.csv"))) {
    lines.filter(line -> !line.startsWith("#"))
         .forEach(this::processLine);
}
Recommendation
Keep memory usage low by streaming the content rather than loading all lines into a list.
3. Large Files (tens or hundreds of MB and above)
Characteristics
Logs, large CSV exports, raw data, batch inputs
Can consume high memory if not handled carefully
Best Practices
Always read line-by-line using BufferedReader.
Avoid Files.readAllBytes() or Files.readString() for large files.
Stream data to process or transform it incrementally.
If the file is binary (images, PDFs, etc.), stream it using InputStream with buffer blocks.
Use backpressure-friendly logic for concurrent processing.
Example
try (BufferedReader reader = Files.newBufferedReader(Path.of("bigdata.txt"))) {
    String line;
    while ((line = reader.readLine()) != null) {
        process(line); // Avoid storing all lines in memory
    }
}
Recommendation
For large binary files, avoid converting to String. Process them as byte streams or chunked downloads/uploads.
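A minimal sketch of chunked binary streaming with a fixed-size buffer (the source and target paths are supplied by the caller):
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public void copyLargeBinaryFile(Path source, Path target) throws IOException {
    try (InputStream in = Files.newInputStream(source);
         OutputStream out = Files.newOutputStream(target)) {
        byte[] buffer = new byte[8 * 1024]; // process the file in 8 KB blocks
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            out.write(buffer, 0, bytesRead); // only one buffer is ever held in memory
        }
    }
}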
4. Uploaded Files (via REST API)
Best Practices
Use MultipartFile.getInputStream() instead of getBytes() for large uploads.
Validate file type, size, and content before processing.
Avoid reading large files into memory in one go; stream them.
Write the uploaded file to a temporary directory if needed before further processing (a sketch follows the example below).
Example
@PostMapping("/upload")
public ResponseEntity<Void> handleUpload(@RequestParam MultipartFile file) throws IOException {
try (BufferedReader reader = new BufferedReader(new InputStreamReader(file.getInputStream()))) {
String line;
while ((line = reader.readLine()) != null) {
process(line);
}
}
return ResponseEntity.ok().build();
}
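For the temporary-directory tip above, a minimal sketch using Files.createTempFile and MultipartFile.transferTo (the transferTo(Path) overload assumes Spring 5.1 or later; the prefix and suffix are illustrative):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import org.springframework.web.multipart.MultipartFile;

public Path storeUploadTemporarily(MultipartFile file) throws IOException {
    // Create a file in the system temp directory and hand the upload off to it.
    Path tempFile = Files.createTempFile("upload-", ".tmp");
    file.transferTo(tempFile);
    return tempFile; // the caller is responsible for deleting the file after processing
}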
5. Binary Files (PDF, image, Excel, etc.)
Best Practices
Use InputStream or Files.readAllBytes() when we need the exact binary content.
If the file is large, stream it to avoid out-of-memory errors.
Set the appropriate content type and content disposition when returning it in a REST response (a sketch follows the example below).
Example
byte[] content = Files.readAllBytes(Path.of("invoice.pdf"));
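A minimal sketch of returning that content from a controller with an explicit content type and content disposition (the endpoint path and file name are illustrative):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;

@GetMapping("/invoice")
public ResponseEntity<byte[]> downloadInvoice() throws IOException {
    byte[] content = Files.readAllBytes(Path.of("invoice.pdf"));
    return ResponseEntity.ok()
            .contentType(MediaType.APPLICATION_PDF)
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"invoice.pdf\"")
            .body(content);
}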
Why BufferedReader is Preferred for Large Files?
1. Low Memory Usage
It does not load the entire file into memory.
Instead, it reads one line at a time using an internal buffer.
This makes it suitable for files that are hundreds of megabytes or even gigabytes in size.
2. Buffered I/O Improves Performance
BufferedReader wraps around a Reader (like FileReader or InputStreamReader) and uses a buffer (default: 8 KB) to read chunks of data at once.
This reduces the number of I/O operations (disk reads), making it much faster than unbuffered readers.
3. Line-by-Line Processing
Provides the readLine() method to easily process each line individually.
Allows streaming large files without needing to split or parse the entire file in memory.
4. Suitable for Sequential Processing
Ideal for use cases where we want to:
Parse a log file line by line.
Validate or transform each row in a CSV.
Search for specific content without loading the full file.
5. Avoids OutOfMemoryError
Using Files.readAllBytes() or Files.readString() on a large file can lead to heap exhaustion and OutOfMemoryError, especially in memory-constrained environments like containers or small JVMs.
BufferedReader avoids this by only holding small chunks in memory.
try (BufferedReader reader = new BufferedReader(new FileReader("large-file.txt"))) {
    String line;
    while ((line = reader.readLine()) != null) {
        process(line); // Safe, line-by-line
    }
}
This approach:
Keeps memory usage predictable and low.
Does not block or crash even with multi-GB files.
Can be paused, streamed, or throttled easily.