Stream operations are divided into intermediate (Stream-producing) operations and terminal (value- or side-effect-producing) operations. Intermediate operations are always lazy.
1、Streams can be obtained in a number of ways (a short sketch follows this list).
· From a Collection via the stream() and parallelStream() methods;
· From an array via Arrays.stream(Object[]);
· From static factory methods on the stream classes, such as Stream.of(Object[]), IntStream.range(int, int) or Stream.iterate(Object, UnaryOperator);
· The lines of a file can be obtained from BufferedReader.lines();
· Streams of file paths can be obtained from methods in Files;
· Streams of random numbers can be obtained from Random.ints();
· Numerous other stream-bearing methods in the JDK, including BitSet.stream(), Pattern.splitAsStream(java.lang.CharSequence), and JarFile.stream().
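A minimal sketch showing a few of these sources (class and variable names are illustrative, not from the API docs):

import java.util.Arrays;
import java.util.Random;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class StreamSources {
    public static void main(String[] args) {
        Stream<String> fromCollection = Arrays.asList("a", "b", "c").stream(); // from a Collection
        IntStream fromArray = Arrays.stream(new int[] {1, 2, 3});              // from an array
        IntStream range = IntStream.range(0, 10);                              // 0..9
        Stream<Integer> iterated = Stream.iterate(1, n -> n * 2);              // infinite: 1, 2, 4, 8, ...
        IntStream randoms = new Random().ints(5);                              // five pseudorandom ints

        System.out.println(range.sum()); // 45
    }
}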
1、Intermediate operations
Intermediate operations return a new stream. They are always lazy. Intermediate operations are further divided into stateless operations (such as filter and map), which retain no state from previously seen elements when processing a new element, and stateful operations (such as distinct and sorted), which may incorporate state from previously seen elements when processing new elements.
Traversal of the pipeline source does not begin until the terminal operation of the pipeline is executed.
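A small sketch of this laziness: nothing inside filter runs while the pipeline is only being built; the printouts only appear once the terminal operation count() is invoked (names and values are illustrative).

import java.util.stream.Stream;

public class LazyDemo {
    public static void main(String[] args) {
        Stream<String> pipeline = Stream.of("one", "two", "three")
                .filter(w -> {
                    System.out.println("filtering " + w); // not printed yet: intermediate op is lazy
                    return w.length() > 3;
                });

        System.out.println("pipeline built, source not traversed yet");

        long count = pipeline.count(); // terminal operation: traversal happens here, the filter lines print now
        System.out.println(count);     // 1 ("three")
    }
}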
2、Terminal operations
After the terminal operation is performed, the stream pipeline is considered consumed, and can no longer be used.
In almost all cases, terminal operations are eager, completing their traversal of the data source and processing of the pipeline before returning (e.g. Stream.forEach or IntStream.sum). Only the terminal operations iterator() and spliterator() are not.
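A minimal sketch of a pipeline being consumed: the first terminal operation (sum) traverses eagerly, and any further terminal operation on the same stream throws IllegalStateException (example is illustrative).

import java.util.stream.Stream;

public class ConsumedDemo {
    public static void main(String[] args) {
        Stream<Integer> s = Stream.of(1, 2, 3);

        int sum = s.mapToInt(Integer::intValue).sum(); // eager terminal operation, consumes the pipeline
        System.out.println(sum); // 6

        try {
            s.forEach(System.out::println); // second use of the same stream
        } catch (IllegalStateException e) {
            System.out.println("stream already operated upon or closed");
        }
    }
}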
About short-circuiting:
An intermediate operation is short-circuiting if, when presented with infinite input, it may produce a finite stream as a result. A terminal operation is short-circuiting if, when presented with infinite input, it may terminate in finite time. Having a short-circuiting operation in the pipeline is a necessary, but not sufficient, condition for the processing of an infinite stream to terminate normally in finite time.
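A small sketch of both kinds of short-circuiting against an infinite source (Stream.iterate): limit() is a short-circuiting intermediate operation and anyMatch() a short-circuiting terminal operation (values are illustrative).

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ShortCircuitDemo {
    public static void main(String[] args) {
        // limit(): infinite input, finite resulting stream
        List<Integer> firstFive = Stream.iterate(1, n -> n + 1)
                .limit(5)
                .collect(Collectors.toList());
        System.out.println(firstFive); // [1, 2, 3, 4, 5]

        // anyMatch(): terminates in finite time once a match is found
        boolean found = Stream.iterate(1, n -> n + 1).anyMatch(n -> n > 100);
        System.out.println(found); // true
    }
}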
Link: https://stackoverflow.com/questions/1073909/side-effect-whats-this?noredirect=1
A side effect is anything a method does besides computing and returning a value. Any change of instance or class field values is a side effect, as is drawing something on the screen, writing to a file or a network connection.
Strictly speaking, a "function" is defined as not having side effects - which is why Java uses the word "method" instead. A real function with no return value would be pointless.
Obviously, a method that does not have a return value must have some sort of side effect that justifies its existence. Set methods are an example - the side effect is changing the object's internal state.
For Example:
A side effect is when a method call changes a class's state. So:
public class SideEffectClass {
    private int state = 0;

    public void doSomething(int arg0) {
        state += arg0;
    }
}
Here, doSomething(int arg0) has the side effect of changing the state variable.
When you think of a program, you can think of it as instructions + state + input. So if the domain of a program is the range of all possible input * state, and the program has side effects, you can see that the codomain of possible results for the application can grow explosively as the number of side effects increases. This makes the possible states for the program large, which leads to complicated testing. The functional programming paradigm is designed to eliminate side effects. By making functions first-class citizens and by making all declarations immutable, functional programming prevents side effects, which makes functional programming shine in parallel processing, as synchronization issues are reduced.
Example from the API docs:
As an example of how to transform a stream pipeline that inappropriately uses side-effects to one that does not, the following code searches a stream of strings for those matching a given regular expression, and puts the matches in a list.
ArrayList<String> results = new ArrayList<>();
stream.filter(s -> pattern.matcher(s).matches())
      .forEach(s -> results.add(s)); // Unnecessary use of side-effects!
This code unnecessarily uses side-effects. If executed in parallel, the non-thread-safety of ArrayList would cause incorrect results, and adding needed synchronization would cause contention, undermining the benefit of parallelism. Furthermore, using side-effects here is completely unnecessary; the forEach() can simply be replaced with a reduction operation that is safer, more efficient, and more amenable to parallelization:
List<String> results =
    stream.filter(s -> pattern.matcher(s).matches())
          .collect(Collectors.toList()); // No side-effects!
Certain stream sources (such as List or arrays) are intrinsically ordered, whereas others (such as HashSet) are not. Some intermediate operations, such as sorted(), may impose an encounter order on an otherwise unordered stream, and others may render an ordered stream unordered, such as BaseStream.unordered(). Further, some terminal operations may ignore encounter order, such as forEach().
If a stream is ordered, most operations are constrained to operate on the elements in their encounter order; if the source of a stream is a List containing [1, 2, 3], then the result of executing map(x -> x*2) must be [2, 4, 6]. However, if the source has no defined encounter order, then any permutation of the values [2, 4, 6] would be a valid result.
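A brief sketch of the difference (the output of the plain forEach may vary between runs, since it ignores encounter order; names and values are illustrative):

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class OrderDemo {
    public static void main(String[] args) {
        List<Integer> source = Arrays.asList(1, 2, 3);

        // Ordered source: collect preserves encounter order even in parallel
        List<Integer> doubled = source.parallelStream()
                .map(x -> x * 2)
                .collect(Collectors.toList());
        System.out.println(doubled); // always [2, 4, 6]

        // forEach may ignore encounter order; forEachOrdered respects it
        source.parallelStream().map(x -> x * 2).forEach(System.out::println);        // any order
        source.parallelStream().map(x -> x * 2).forEachOrdered(System.out::println); // 2, 4, 6
    }
}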
A reduction operation (also called a fold) takes a sequence of input elements and combines them into a single summary result by repeated application of a combining operation, such as finding the sum or maximum of a set of numbers, or accumulating elements into a list. The streams classes have multiple forms of general reduction operations, called reduce() and collect(), as well as multiple specialized reduction forms such as sum(), max(), or count().
(1) Without reduce:
int sum = 0;
for (int x : numbers) {
sum += x;
}
(2) Using reduce:
int sum = numbers.stream().reduce(0, (x,y) -> x+y);
or:
int sum = numbers.stream().reduce(0, Integer::sum);
(3) The general form: a reduce operation on elements of type <T> yielding a result of type <U> requires three parameters:
<U> U reduce(U identity,
             BiFunction<U, ? super T, U> accumulator,
             BinaryOperator<U> combiner);
· identity element: both an initial seed value for the reduction and a default result if there are no input elements.
· accumulator function: takes a partial result and the next element, and produces a new partial result.
· combiner function: combines two partial results to produce a new partial result (a worked sketch follows below).
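A worked sketch of this three-argument form, reducing a stream of String elements to an Integer total (class name and values are illustrative):

import java.util.Arrays;
import java.util.List;

public class ReduceThreeArg {
    public static void main(String[] args) {
        List<String> words = Arrays.asList("stream", "reduce", "example");

        int totalLength = words.parallelStream().reduce(
                0,                                           // identity: seed and default result
                (partial, word) -> partial + word.length(),  // accumulator: partial result + next element
                Integer::sum);                               // combiner: merges partial results from subtasks
        System.out.println(totalLength); // 19
    }
}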
A mutable reduction operation accumulates input elements into a mutable result container, such as a Collection or StringBuilder, as it processes the elements in the stream.
ordinary reduction:
String concatenated = strings.reduce("", String::concat);
Mutable reduction operation: collect(). It collects together the desired results into a result container such as a Collection.
A collect operation requires three functions:
· a supplier function to construct new instances of the result container,
· an accumulator function to incorporate an input element into a result container,
· a combining function to merge the contents of one result container into another.
The form of this is very similar to the general form of ordinary reduction:
<R> R collect(Supplier<R> supplier,
              BiConsumer<R, ? super T> accumulator,
              BiConsumer<R, R> combiner);
Old:
ArrayList<String> strings = new ArrayList<>();
for (T element : stream) {
strings.add(element.toString());
}
New:
ArrayList<String> strings = stream.collect(() -> new ArrayList<>(),
                                           (c, e) -> c.add(e.toString()),
                                           (c1, c2) -> c1.addAll(c2));
or, pulling the mapping operation out of the accumulator function, we could express it more succinctly as:
List<String> strings = stream.map(Object::toString)
                             .collect(ArrayList::new, ArrayList::add, ArrayList::addAll);
Here, our supplier is just the ArrayList constructor, the accumulator adds the stringified element to an ArrayList, and the combiner simply uses addAll to copy the strings from one container into the other.
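As a further illustration, the same three-function collect form also works with a non-Collection container such as the StringBuilder mentioned earlier; a minimal sketch (class name and values are illustrative):

import java.util.stream.Stream;

public class CollectIntoStringBuilder {
    public static void main(String[] args) {
        String joined = Stream.of("a", "b", "c")
                .collect(StringBuilder::new,      // supplier: new result container
                         StringBuilder::append,   // accumulator: append an element
                         StringBuilder::append)   // combiner: append one container's contents to another's
                .toString();
        System.out.println(joined); // abc
    }
}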