Lesson 3: Working with Filter Streams in Java
Objective: Understand the purpose of Java filter stream classes and how to use them effectively.
Purpose of Java Filter Stream Classes
Filter streams in the java.io package allow developers to transform or extend the behavior of input and output streams.
They provide reusable building blocks for tasks such as buffering, data conversion, or formatting.
Multiple filters can be chained together, creating a flexible pipeline for processing data.
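As a minimal sketch of such a pipeline, the snippet below layers a DataInputStream over a BufferedInputStream over a raw byte source (a ByteArrayInputStream is used here only so the example is self-contained; in practice the source would typically be a FileInputStream):

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ChainDemo {
    // Chains two filters over a raw byte source:
    // buffering underneath, typed reads on top.
    public static int readBufferedInt(byte[] raw) throws IOException {
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(new ByteArrayInputStream(raw)))) {
            return in.readInt();
        }
    }
}
```

Closing the outermost stream closes the whole chain, so a single try-with-resources statement is enough.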
Buffered Reads and Writes
BufferedInputStream and BufferedOutputStream reduce the overhead of frequent disk or network operations by storing data in an internal buffer. Applications read or write from the buffer instead of accessing the underlying device directly. This improves performance and also enables features such as marking a position and rereading buffered data via mark() and reset():
try (BufferedInputStream in = new BufferedInputStream(new FileInputStream("data.txt"))) {
    int b;
    while ((b = in.read()) != -1) {
        System.out.print((char) b);
    }
}
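The write side works the same way. The sketch below (the file name and byte count are arbitrary choices for illustration) performs many small writes, but the underlying file is only touched when the internal buffer fills or the stream is flushed or closed:

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class BufferedWriteDemo {
    // Writes count copies of a byte through a buffer; the underlying
    // file is updated only when the buffer fills or the stream closes.
    public static void writeRepeated(String path, byte value, int count)
            throws IOException {
        try (BufferedOutputStream out =
                new BufferedOutputStream(new FileOutputStream(path))) {
            for (int i = 0; i < count; i++) {
                out.write(value);
            }
        } // close() flushes any bytes still sitting in the buffer
    }
}
```

Forgetting to flush or close a BufferedOutputStream is a classic source of truncated output files.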
FilterInputStream and FilterOutputStream
FilterInputStream and FilterOutputStream are the superclasses for filters. (Note that they are concrete classes, not abstract ones.) They delegate work to an underlying stream, while subclasses implement specialized behavior. For example, BufferedInputStream and DataInputStream extend these classes to add buffering and data-type support. Although polymorphism is possible, filter streams are usually referenced via the more general InputStream or OutputStream APIs.
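To illustrate the delegation pattern, here is a hypothetical FilterInputStream subclass (the class name CountingInputStream is invented for this sketch) that passes every call through to the wrapped stream while counting the bytes it sees:

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Illustrative subclass: counts every byte that passes through.
public class CountingInputStream extends FilterInputStream {
    private long count = 0;

    public CountingInputStream(InputStream in) {
        super(in); // the superclass delegates all work to the wrapped stream
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b != -1) count++;
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) count += n;
        return n;
    }

    public long getCount() { return count; }
}
```

Because the superclass already forwards every method, a subclass only overrides the operations whose behavior it wants to change.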
Reading and Writing Data Types
DataInputStream and DataOutputStream provide methods for reading and writing primitive data types (int, double, and so on) as well as strings in a modified UTF-8 encoding, all in a machine-independent format that ensures portability across platforms.
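A minimal round-trip sketch, using an in-memory byte array as the destination so the example is self-contained (the specific values written are arbitrary):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class DataRoundTrip {
    // Writes an int, a double, and a string; reads them back
    // in the same order they were written.
    public static Object[] roundTrip() throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(bytes)) {
            out.writeInt(42);
            out.writeDouble(3.14);
            out.writeUTF("hello");
        }
        try (DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return new Object[] { in.readInt(), in.readDouble(), in.readUTF() };
        }
    }
}
```

The reads must mirror the writes exactly: the byte stream carries no type tags, so reading values in a different order or with different types silently corrupts the data.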
Additional Useful Filter Streams
PrintStream adds convenient formatted-output methods such as println and printf, while PushbackInputStream lets a parser "unread" bytes it has looked ahead at.
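As one sketch of a filter beyond buffering and data types, PushbackInputStream supports a peek-then-unread pattern often used in parsers (the method name peekThenReadAll below is invented for this illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.PushbackInputStream;

public class PushbackDemo {
    // Peeks at the first byte, then pushes it back so the
    // subsequent read loop still sees the complete stream.
    public static String peekThenReadAll(byte[] data) throws IOException {
        PushbackInputStream in =
                new PushbackInputStream(new ByteArrayInputStream(data));
        int first = in.read();          // look ahead one byte
        if (first != -1) {
            in.unread(first);           // put it back for the next reader
        }
        StringBuilder sb = new StringBuilder();
        int b;
        while ((b = in.read()) != -1) {
            sb.append((char) b);
        }
        return sb.toString();
    }
}
```

By default a PushbackInputStream holds a one-byte pushback buffer; a constructor overload accepts a larger buffer size when more lookahead is needed.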
Summary
Filter streams provide flexibility by layering functionality over raw streams.
They are essential for performance (buffering), data portability (DataInput/DataOutput),
and developer convenience (PrintStream, PushbackInputStream).
By chaining filters, Java developers can create powerful and efficient data-processing pipelines.
