A Stream represents a sequence of elements of some type T that can be processed in a functional, declarative style rather than by looping manually.
A stream pipeline has three parts:
Source – a collection, array, or generator function
Intermediate operations – transform the data (filter, map, sorted)
Terminal operation – produces a result or side effect (collect, forEach, reduce)
import java.util.*;
import java.util.stream.*;

public class StreamExample {
    public static void main(String[] args) {
        List<String> names = List.of("Alice", "Bob", "Charlie");
        // Create a stream from the list
        Stream<String> stream = names.stream();
        // Process the stream: filter + map + forEach
        stream.filter(name -> name.startsWith("C"))
              .map(String::toUpperCase)
              .forEach(System.out::println); // prints CHARLIE
    }
}
List<Integer> numbers = List.of(1, 2, 3, 4, 5);
int sum = numbers.stream()
                 .filter(n -> n % 2 == 1)  // keep odd numbers: 1, 3, 5
                 .map(n -> n * n)          // square them: 1, 9, 25
                 .reduce(0, Integer::sum); // sum them up
System.out.println(sum);                   // 35
List<T>                                     | Stream<T>
--------------------------------------------|-------------------------------------------
A collection of elements stored in memory   | A pipeline of elements to be processed
Data structure                              | Data flow
Eager evaluation (immediate)                | Lazy evaluation (on demand)
Mutable: elements can be added/removed      | Read-only: the flow cannot be modified
Can be iterated multiple times              | Can be consumed only once
Stores and accesses data (object-oriented)  | Processes and transforms data (functional)
A stream has no storage of its own; it pulls data from its source (here, the list).
Each operation (filter, map, etc.) builds a pipeline.
Operations like filter and map are lazy — they don’t run until a terminal operation (like forEach, reduce, or collect) triggers the pipeline.
List (data in memory)
↓
Stream (processing pipeline)
↓
Terminal operation (collect, forEach, reduce)
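The laziness described above is easy to observe. In this minimal sketch, peek prints a message for each element it visits; nothing is printed while the pipeline is being built, and everything runs only when the terminal count() is called:

```java
import java.util.List;
import java.util.stream.Stream;

public class LazyDemo {
    public static void main(String[] args) {
        Stream<Integer> pipeline = List.of(1, 2, 3).stream()
            .peek(n -> System.out.println("visiting " + n)) // lazy: not run yet
            .map(n -> n * 10);

        System.out.println("Pipeline built, nothing visited yet");

        long count = pipeline.filter(n -> n > 10).count(); // terminal op triggers traversal
        System.out.println("count = " + count);            // 20 and 30 pass the filter
    }
}
```

Running this prints "Pipeline built, nothing visited yet" first, then the three "visiting" lines, confirming that the intermediate operations did not execute on their own.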
List → Stream list.stream()
Stream → List stream.collect(Collectors.toList()) or, since Java 16, stream.toList()
List<Integer> numbers = List.of(1, 2, 3, 4, 5);
List<Integer> squares = numbers.stream()
                               .map(n -> n * n)
                               .collect(Collectors.toList());
// squares: [1, 4, 9, 16, 25]
Once the pipeline has finished flowing, the stream is consumed.
You can't "rewind" it or start it again; calling a second terminal operation on the same stream throws java.lang.IllegalStateException: stream has already been operated upon or closed
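A short sketch that triggers this exception on purpose:

```java
import java.util.List;
import java.util.stream.Stream;

public class ReuseDemo {
    public static void main(String[] args) {
        Stream<String> s = List.of("a", "b").stream();
        System.out.println(s.count()); // first terminal op consumes the stream: 2
        try {
            s.count();                 // second terminal op on the same stream
        } catch (IllegalStateException e) {
            System.out.println("Caught: " + e.getMessage());
        }
    }
}
```

To process the same data again, call names.stream() a second time to get a fresh stream from the source.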
Intermediate Operations: filter(), map(), sorted(), distinct() → Transform the stream into another stream (lazy).
Intermediate operations are lazy and can be chained. They don’t execute until a terminal operation is called.
Terminal Operations: forEach(), reduce(), collect(), count() → Produce a result or side-effect (like printing) and consume the stream (the stream cannot be reused).
Terminal operations trigger the evaluation of the pipeline.
These operations take arguments whose types are functional interfaces, usually implemented by lambdas or method references:
Predicate<T> → boolean test(T t)
Function<T, R> → R apply(T t)
Consumer<T> → void accept(T t)
Supplier<T> → T get()
Comparator<T> → int compare(T a, T b)
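A minimal sketch showing each of these interfaces implemented by a lambda or method reference:

```java
import java.util.Comparator;
import java.util.function.*;

public class InterfacesDemo {
    public static void main(String[] args) {
        Predicate<String> nonEmpty = s -> !s.isEmpty();     // boolean test(T t)
        Function<String, Integer> len = String::length;     // R apply(T t)
        Consumer<String> print = System.out::println;       // void accept(T t)
        Supplier<String> hello = () -> "hello";             // T get()
        Comparator<String> byLen =                          // int compare(T a, T b)
            Comparator.comparingInt(String::length);

        System.out.println(nonEmpty.test("hi"));      // true
        System.out.println(len.apply("hello"));       // 5
        print.accept(hello.get());                    // hello
        System.out.println(byLen.compare("a", "ab")); // -1 (shorter sorts first)
    }
}
```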
filter(Predicate<? super T> predicate) —— stream.filter(n -> n > 0)
Keep only elements that match the predicate
map(Function<? super T, ? extends R> mapper) —— stream.map(String::length)
Transform elements into another type
mapToInt(ToIntFunction<? super T> mapper) —— stream.mapToInt(String::length)
Transform elements to an IntStream
mapToLong(ToLongFunction<? super T> mapper) —— stream.mapToLong(String::length)
Transform elements to a LongStream
mapToDouble(ToDoubleFunction<? super T> mapper) —— stream.mapToDouble(String::length)
Transform elements to a DoubleStream
flatMap(Function<? super T, ? extends Stream<? extends R>> mapper) —— stream.flatMap(list -> list.stream())
Flatten nested streams
distinct() —— stream.distinct()
Remove duplicate elements
sorted() —— stream.sorted()
Sort elements naturally
sorted(Comparator<? super T> comparator) —— stream.sorted(Comparator.reverseOrder())
Sort elements using custom comparator
peek(Consumer<? super T> action) —— stream.peek(System.out::println)
Perform an action for debugging or side-effect
limit(long maxSize) —— stream.limit(5)
Keep only the first n elements
skip(long n) —— stream.skip(2)
Skip the first n elements
unordered() —— stream.unordered()
Allow non-deterministic order for optimization
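Several of the intermediate operations above can be chained into one pipeline; this sketch combines distinct, sorted, skip, and limit:

```java
import java.util.List;

public class ChainDemo {
    public static void main(String[] args) {
        List<Integer> result = List.of(5, 3, 3, 8, 1, 9, 8, 2).stream()
            .distinct() // 5, 3, 8, 1, 9, 2
            .sorted()   // 1, 2, 3, 5, 8, 9
            .skip(1)    // 2, 3, 5, 8, 9
            .limit(3)   // 2, 3, 5
            .toList();
        System.out.println(result); // [2, 3, 5]
    }
}
```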
forEach() → void / side-effect
Iterates through elements
collect() → Collection (List, Set, Map)
Materializes the stream into a collection
reduce() → Single value
Combines elements into one value
count() → long
Returns the number of elements
anyMatch(), allMatch(), noneMatch() → boolean
Checks conditions on elements
findFirst(), findAny() → Optional<T>
Returns an element wrapped in Optional
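The terminal operations above in one small sketch (note each line builds a fresh stream, since a stream can only be consumed once):

```java
import java.util.List;
import java.util.Optional;

public class TerminalDemo {
    public static void main(String[] args) {
        List<Integer> nums = List.of(2, 4, 6, 7);

        long evens = nums.stream().filter(n -> n % 2 == 0).count();
        boolean anyOdd = nums.stream().anyMatch(n -> n % 2 == 1);
        boolean allPositive = nums.stream().allMatch(n -> n > 0);
        Optional<Integer> first = nums.stream().filter(n -> n > 5).findFirst();

        System.out.println(evens);            // 3
        System.out.println(anyOdd);           // true
        System.out.println(allPositive);      // true
        System.out.println(first.orElse(-1)); // 6
    }
}
```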
Input → Output: 1-to-1 transformation.
Result: A stream of the same “shape” (one element per input element).
Example: words to length of each word
List<String> words = List.of("hello", "world");
List<Integer> lengths = words.stream()
                             .map(String::length) // map each word to its length
                             .toList();
System.out.println(lengths); // [5, 5]
Each String → Integer (1-to-1 mapping).
Input → Output: 1-to-many transformation.
Result: A single, flat stream instead of a stream of streams.
Example: splitting words into letters:
List<String> words = List.of("hi", "ok");
List<String> letters = words.stream()
                            .flatMap(word -> Arrays.stream(word.split("")))
                            .toList();
System.out.println(letters); // [h, i, o, k]
Each String → stream of letters
flatMap() merges all these small streams into one flat stream.
💡 Tip: Whenever your map() function returns a stream itself, you probably want flatMap() instead of map().
map() → [ "hi", "ok" ] → [ ["h","i"], ["o","k"] ] (stream of streams)
flatMap() → [ "hi", "ok" ] → [ "h","i","o","k" ] (flattened)
Function<Item, Double> applyTax = item ->
    item.category.equals("Electronics") ? item.price * 1.1 : item.price;
Function<Double, Double> applyDiscount = price ->
    price > 100 ? price * 0.95 : price;

double total = items.stream()
                    .map(applyTax)
                    .map(applyDiscount)
                    .reduce(0.0, Double::sum);
System.out.println("Total: " + total);
Each transformation is pure and reusable (applyTax, applyDiscount)
Composable: we can chain transformations without changing state
No mutable variables
Easier to test each function independently
Ready for parallel execution
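The composability point can be made explicit with Function.andThen, which fuses the two pure functions into one reusable transformation. Here Item is a hypothetical record standing in for the class assumed by the snippet above:

```java
import java.util.List;
import java.util.function.Function;

public class ComposeDemo {
    // Hypothetical Item type matching the earlier snippet
    record Item(String category, double price) {}

    public static void main(String[] args) {
        Function<Item, Double> applyTax = item ->
            item.category().equals("Electronics") ? item.price() * 1.1 : item.price();
        Function<Double, Double> applyDiscount = price ->
            price > 100 ? price * 0.95 : price;

        // Compose the two functions into a single Item -> final price mapping
        Function<Item, Double> finalPrice = applyTax.andThen(applyDiscount);

        List<Item> items = List.of(
            new Item("Electronics", 200.0), // taxed to ~220, then discounted
            new Item("Books", 50.0));       // no tax, under the discount threshold

        double total = items.stream()
                            .map(finalPrice)
                            .reduce(0.0, Double::sum);
        System.out.println("Total: " + total); // ~259.0 (floating-point rounding applies)
    }
}
```

A single map(finalPrice) then replaces the two map calls, without changing the result.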
A parallel stream is a way to process collections (like List, Set) in parallel, using the Fork-Join framework internally, without you having to manually manage threads.
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);

// Sequential stream
int sumSequential = numbers.stream()
                           .mapToInt(Integer::intValue)
                           .sum();

// Parallel stream
int sumParallel = numbers.parallelStream()
                         .mapToInt(Integer::intValue)
                         .sum();
Splitting: The collection is divided into chunks (using Spliterator).
Forking: Each chunk is processed in a separate thread from the common ForkJoinPool.
Joining: Results from each thread are combined automatically.
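A quick way to see the forking step is to print which thread handles each element. Thread names (and the interleaving) vary from run to run, so there is no fixed output:

```java
import java.util.List;

public class ForkJoinDemo {
    public static void main(String[] args) {
        // Elements may be handled by main or by ForkJoinPool.commonPool worker threads
        List.of(1, 2, 3, 4, 5, 6, 7, 8).parallelStream()
            .forEach(n -> System.out.println(
                n + " handled by " + Thread.currentThread().getName()));
    }
}
```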
Use a parallel stream when:
Large datasets (arrays, lists with thousands or millions of elements)
CPU-intensive operations (math, transformations)
Independent operations (no shared mutable state)
Avoid parallel streams when:
The task is I/O-bound (e.g., network calls, file reads)
The dataset is small (overhead may outweigh benefits)
You're modifying shared data structures (risk of race conditions)
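The shared-mutable-state pitfall deserves a sketch. Adding to a plain ArrayList from a parallel stream is a race condition (lost elements, or even an exception); the safe pattern is to let the stream build the result itself:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.IntStream;

public class SharedStateDemo {
    public static void main(String[] args) {
        // UNSAFE: ArrayList is not thread-safe, so this can lose elements
        // or throw under parallel execution (left commented out on purpose):
        List<Integer> unsafe = new ArrayList<>();
        // IntStream.range(0, 10_000).parallel().forEach(unsafe::add);

        // SAFE: no shared mutation; the stream assembles the result
        List<Integer> safe = IntStream.range(0, 10_000)
                                      .parallel()
                                      .boxed()
                                      .toList();
        System.out.println(safe.size()); // always 10000
    }
}
```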
List<String> words = Arrays.asList("java", "parallel", "streams", "fork", "join");
int totalLength = words.parallelStream()
                       .mapToInt(String::length) // map
                       .sum();                   // reduce
System.out.println("Total length: " + totalLength); // Total length: 27
Here, the mapping of strings to lengths happens in parallel threads, then results are summed.
Order-sensitive operations: .forEachOrdered() may be needed if order matters.
Shared mutable data: Avoid modifying external collections inside a parallel stream.
ForkJoinPool size: Parallel streams use the common pool, which might interfere with other parallel tasks if overused.
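The ordering caveat is easy to demonstrate: forEach on a parallel stream may print elements in any order, while forEachOrdered preserves the encounter order even when the work is done in parallel:

```java
import java.util.List;

public class OrderedDemo {
    public static void main(String[] args) {
        List<Integer> nums = List.of(1, 2, 3, 4, 5);

        // Order not guaranteed: interleaving varies per run
        nums.parallelStream().forEach(n -> System.out.print(n + " "));
        System.out.println();

        // Encounter order preserved, at some cost to parallel throughput
        nums.parallelStream().forEachOrdered(n -> System.out.print(n + " "));
        System.out.println(); // 1 2 3 4 5
    }
}
```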