How the Streams reduce method works in Java

The reduce method is a method found within the stream API. A stream is a conduit for data. Thus, a stream represents a sequence of objects. A stream operates on a data source, such as an array or a collection. A stream, itself, never provides storage for the data. It simply moves data, possibly filtering, sorting, or otherwise operating on that data in the process. As a general rule, however, a stream operation by itself does not modify the data source. For example, sorting a stream does not change the order of the source. Rather, sorting a stream results in the creation of a new stream that produces the sorted result.

reduce()

The reduce() method combines the elements of a stream into a single result. It is a terminal operation and a reduction, which means it processes all elements of the stream. Stream defines three versions of reduce().

Optional<T> reduce(BinaryOperator<T> accumulator)

The first form returns an object of type Optional, which contains the result if the stream is non-empty. Here, accumulator is a function that operates on two values and produces a result.
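As a quick sketch of this first form (the class name and sample values here are mine, not from the original), note how the Optional return type handles the empty-stream case:

```java
import java.util.List;
import java.util.Optional;

public class ReduceOptionalDemo {
    public static void main(String[] args) {
        List<Integer> nums = List.of(3, 5, 8);

        // With no identity value, reduce() cannot guarantee a result
        // for an empty stream, so it returns an Optional instead.
        Optional<Integer> product = nums.stream()
                                        .reduce((a, b) -> a * b);
        System.out.println(product.orElse(0)); // prints 120

        // An empty stream yields an empty Optional.
        Optional<Integer> none = List.<Integer>of().stream()
                                     .reduce((a, b) -> a * b);
        System.out.println(none.isPresent()); // prints false
    }
}
```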

T reduce(T identityVal, BinaryOperator<T> accumulator)

In the second form, identityVal is a value such that an accumulator operation involving identityVal and any element of the stream yields that element, unchanged. This version returns the result directly as a T; if the stream is empty, it simply returns identityVal.
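For instance, 0 is the identity value for addition, since 0 + x yields x for any x. A minimal sketch of the second form (sample values are my own):

```java
import java.util.List;

public class ReduceIdentityDemo {
    public static void main(String[] args) {
        List<Integer> nums = List.of(3, 5, 8);

        // 0 is the identity for addition: 0 + x == x for any x.
        int sum = nums.stream().reduce(0, (a, b) -> a + b);
        System.out.println(sum); // prints 16

        // With an identity value, an empty stream simply yields
        // the identity itself, not an Optional.
        int emptySum = List.<Integer>of().stream().reduce(0, (a, b) -> a + b);
        System.out.println(emptySum); // prints 0
    }
}
```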

<U> U reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner)

The third form is used when the result type differs from the element type of the stream. The accumulator folds each element into a partial result of type U, and the combiner merges those partial results into a final one. This makes the three-argument reduce() particularly useful with parallel streams, where the stream is decomposed into segments that are reduced by separate threads and then combined at the end.
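As one possible illustration (the class name and sample data are assumptions of mine), here the elements are Strings but the result is an Integer, so the three-argument form is required:

```java
import java.util.List;

public class ReduceCombinerDemo {
    public static void main(String[] args) {
        List<String> words = List.of("alpha", "beta", "gamma");

        // The stream elements are Strings (T) but the result is an
        // Integer (U). The accumulator folds a String into a running
        // length total; the combiner merges partial totals that may
        // have been computed by separate threads.
        int totalLength = words.parallelStream()
                               .reduce(0,
                                       (partial, word) -> partial + word.length(),
                                       Integer::sum);
        System.out.println(totalLength); // prints 14
    }
}
```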

It is important to understand that the accumulator operation must satisfy three constraints.

  • Stateless means that the operation does not rely on any state information. Thus, each element is processed independently.
  • Non-interfering means that the data source is not modified by the operation.
  • Finally, the operation must be associative. Here, the term associative is used in its normal, arithmetic sense, which means that, given an associative operator used in a sequence of operations, it does not matter which pair of operands are processed first. Associativity is of particular importance to the use of reduction operations on parallel streams, discussed in the next section.
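To make the associativity point concrete (a small sketch of my own): addition is associative, so any grouping of operands gives the same sum, but subtraction is not, which is why it makes an unsafe accumulator for a parallel reduction.

```java
public class AssociativityDemo {
    public static void main(String[] args) {
        // Addition is associative: grouping does not matter.
        System.out.println((4 + 2) + 1 == 4 + (2 + 1)); // prints true

        // Subtraction is not associative, so using it as a reduce
        // accumulator on a parallel stream can yield different
        // results depending on how the stream is split.
        System.out.println((4 - 2) - 1); // prints 1
        System.out.println(4 - (2 - 1)); // prints 3
    }
}
```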

Sample Scenario

import java.util.HashMap;
import java.util.Map;

public class BlogDemo {
    public static void main(String[] args) {
        Map<String, Integer> scores = new HashMap<>();
        scores.put("Data Logic", 87);
        scores.put("Internet Programming", 96);
        scores.put("Software Engineering", 67);
        scores.put("Data Structures", 89);
        scores.put("Machine Learning", 66);

        // Stream the entries, extract each score, and sum them with
        // reduce(), using 0 as the identity value for addition.
        int totalScore = scores
                .entrySet()
                .stream()
                .map(Map.Entry::getValue)
                .reduce(0, Integer::sum);

        System.out.println(totalScore); // prints 405
    }
}

In this scenario, we've been given a set of subjects and their scores and asked to return the total. I chose to work with a map because it is a slightly more involved data source than a plain list, yet it reduces just as cleanly once the entries are streamed and mapped to their values.

Thank you for taking the time to read my article; I hope it provides some useful insights into Java Streams.
