Step-By-Step Introduction to
Apache Flink
[Setup, Configure, Run, Tools]
Slim Baltagi @SlimBaltagi
Washington DC Area Apache Flink Meetup
November 19th, 2015
2
For an overview of Apache Flink, see slides at http://www.slideshare.net/sbaltagi
[Flink stack diagram]
• APIs & Libraries: DataSet API (Java/Scala/Python) for batch processing; DataStream API (Java/Scala) for stream processing; libraries and compatibility layers: Gelly, Table, ML, SAMOA, Hadoop M/R, Google Dataflow (WiP), MRQL, Cascading, Zeppelin
• System: a runtime executing distributed streaming dataflows, with a batch optimizer and a stream builder on top
• Deploy: Local (single JVM, embedded, Docker); Cluster (standalone; YARN, Tez, Mesos (WIP)); Cloud (Google’s GCE, Amazon’s EC2, IBM Docker Cloud, …)
• Storage: Files (local, HDFS, S3, Tachyon); Databases (MongoDB, HBase, SQL, …); Streams (Flume, Kafka, RabbitMQ, …)
3
Agenda
1. How to set up and configure your Apache Flink environment?
2. How to use Apache Flink tools?
3. How to run the examples in the Apache
Flink bundle?
4. How to set up your IDE (IntelliJ IDEA or
Eclipse) for Apache Flink?
5. How to write your Apache Flink program
in an IDE?
4
1. How to set up and configure your Apache Flink environment?
1.1 Local (on a single machine)
1.2 Flink in a VM image (on a single machine)
1.3 Flink on Docker
1.4 Standalone Cluster
1.5 Flink on a YARN Cluster
1.6 Flink on the Cloud
5
1.1 Local (on a single machine)
Flink runs on Linux, OS X and Windows. In order to
execute a program on a running Flink instance (and not
from within your IDE) you need to install Flink on your
machine.
The following steps are detailed for both Unix-like (Linux, Mac OS X) and Windows environments:
1.1.1 Verify requirements
1.1.2 Download the Flink binary package
1.1.3 Unpack the downloaded archive
1.1.4 Configure
1.1.5 Start a local Flink instance
1.1.6 Validate Flink is running
1.1.7 Run a Flink example
1.1.8 Stop the local Flink instance
6
1.1 Local (on a single machine)
1.1.1 Verify requirements
The machine that Flink will run on must have Java 1.7.x or higher installed. To check the installed Java version, issue the command: java -version. The out-of-the-box configuration will use your default Java installation.
Optional: if you want to manually override the Java runtime to use, set the JAVA_HOME environment variable (on Unix-like systems). To check whether JAVA_HOME is set, issue the command: echo $JAVA_HOME.
If needed, follow the instructions for installing Java and
setting JAVA_HOME on a Unix system here:
https://docs.oracle.com/cd/E19182-01/820-7851/inst_set_jdk_korn_bash_t/index.html
7
1.1 Local (on a single machine)
In a Windows environment, check the correct installation of Java by issuing the following command: java -version.
The bin folder of your Java Runtime Environment must be included in Windows’ %PATH% variable. If needed, follow this guide to add Java to the PATH variable: http://www.java.com/en/download/help/path.xml
If needed, follow the instructions for installing Java
and setting JAVA_HOME on a Windows system here:
https://docs.oracle.com/cd/E19182-01/820-7851/inst_set_jdk_windows_t/ind
8
1.1 Local (on a single machine)
1.1.2 Download the Flink binary package
The latest stable release of Apache Flink can be
downloaded from http://flink.apache.org/downloads.html
For example, in a Linux-like environment, run the following command:
wget https://www.apache.org/dist/flink/flink-0.10.0/flink-0.10.0-bin-hadoop1-scala_2.10.tgz
Which version to pick?
• You don’t have to install Hadoop to use Flink.
• But if you plan to use Flink with data stored in
Hadoop, pick the version matching your installed
Hadoop version.
• If you don’t want to do this, pick the Hadoop 1
version.
9
1.1 Local (on a single machine)
1.1.3 Unpack the downloaded .tgz archive
Example:
$ cd ~/Downloads # Go to download directory
$ tar -xvzf flink-*.tgz # Unpack the downloaded archive
$ cd flink-0.10.0
$ ls -l
10
1.1 Local (on a single machine)
1.1.4. Configure
The resulting folder contains a Flink setup that can
be locally executed without any further
configuration.
flink-conf.yaml under flink-0.10.0/conf contains the
default configuration parameters that allow Flink to
run out-of-the-box in single node setups.
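As an illustration, the defaults in that file include entries of this form (a sketch of common keys and single-node values; consult your own copy of the file for the authoritative contents):

```
jobmanager.rpc.address: localhost
jobmanager.rpc.port: 6123
jobmanager.web.port: 8081
taskmanager.numberOfTaskSlots: 1
parallelism.default: 1
```

Increasing taskmanager.numberOfTaskSlots lets a single worker run more parallel subtasks; parallelism.default sets the parallelism used when a program does not specify one.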
11
1.1 Local (on a single machine)
1.1.5. Start a local Flink instance:
• Given that you have a local Flink installation, you
can start a Flink instance that runs a master and a
worker process on your local machine in a single
JVM.
• This execution mode is useful for local testing.
• On a UNIX-like system you can start a Flink instance as follows:
cd /to/your/flink/installation
./bin/start-local.sh
12
1.1 Local (on a single machine)
1.1.5. Start a local Flink instance:
On Windows you can start Flink either:
• with the Windows batch files, by running the following commands:
cd C:\to\your\flink\installation
.\bin\start-local.bat
• or with Cygwin and the Unix scripts: start the Cygwin terminal, navigate to your Flink directory and run the start-local.sh script:
$ cd /cygdrive/c
$ cd flink
$ bin/start-local.sh
13
1.1 Local (on a single machine)
The JobManager (the master of the distributed system) automatically starts a web interface to observe program execution. It runs on port 8081 by default (configured in conf/flink-conf.yaml). http://localhost:8081/
1.1.6 Validate that Flink is running
You can validate that a local Flink instance is running
by:
• Issuing the following command: $ jps
(jps: Java Virtual Machine Process Status tool)
• Looking at the log files in ./log/
$ tail log/flink-*-jobmanager-*.log
• Opening the JobManager’s web interface at
$ open http://localhost:8081
14
1.1 Local (on a single machine)
1.1.7 Run a Flink example
• On a UNIX-like system you can run a Flink example as follows:
cd /to/your/flink/installation
./bin/flink run ./examples/WordCount.jar
• On Windows, using the batch files, open a second terminal and run the following commands:
cd C:\to\your\flink\installation
.\bin\flink.bat run .\examples\WordCount.jar
1.1.8 Stop the local Flink instance
• On UNIX, call ./bin/stop-local.sh
• On Windows, quit the running process with Ctrl+C
15
1.2 VM image (on a single machine)
16
1.2 VM image (on a single machine)
Please send me an email at sbaltagi@gmail.com for a link from which you can download a Flink virtual machine.
The Flink VM, which is approximately 4 GB, is from data Artisans http://data-artisans.com/
It currently has Flink 0.10.0, Kafka, IDEs (IntelliJ, Eclipse), Firefox, …
 It will soon contain the FREE training from data Artisans for Flink 0.10.0.
Meanwhile, an older version of this FREE training, based on Flink 0.9.1, is available from http://dataartisans.github.io/flink-training/
17
1.3 Docker
Docker can be used for local development.
Container-based virtualization advantages:
• lightweight and portable
• build once, run anywhere
• ease of packaging applications
• automated and scripted
• isolated
Resource requirements on data processing clusters often exhibit high variation. Elastic deployments reduce TCO (Total Cost of Ownership).
18
1.3 Flink on Docker
• Apache Flink cluster deployment on Docker using Docker-Compose, by Simon Laws from IBM. Talk at Flink Forward in Berlin on October 12, 2015.
 Slides:
http://www.slideshare.net/FlinkForward/simon-laws-apache-flink-cluster-d
dockercompose
 Video recording (40’:49): https://www.youtube.com/watch?v=CaObaAv9tLE
The talk:
• introduces the basic concepts of container isolation, exemplified with Docker
• explains how Apache Flink is made elastic using Docker-Compose
• shows how to push the cluster to the cloud, exemplified with the IBM Docker Cloud
19
1.3 Flink on Docker
• Apache Flink dockerized: a set of scripts to create a local multi-node Flink cluster, with each node inside a Docker container.
https://hub.docker.com/r/gustavonalle/flink/
• Using Docker to set up a reproducible development environment: Apache Flink cluster deployment on Docker using Docker-Compose
https://github.com/apache/flink/tree/master/flink-contrib/d
• Web resources to learn more about Docker:
http://www.flinkbigdata.com/component/tags/tag/47-dock
20
1.4 Standalone Cluster
• See Quick Start – Cluster setup:
https://ci.apache.org/projects/flink/flink-docs-rele
• See instructions on how to run Flink in a fully distributed fashion on a static (possibly heterogeneous) cluster. This involves two steps:
• Installing and configuring Flink
• Installing and configuring the Hadoop Distributed File System (HDFS)
https://ci.apache.org/projects/flink/flink-docs-master/setup/cluster_setup.ht
21
1.5 Flink on a YARN Cluster
You can easily deploy Flink on your
existing YARN cluster:
1. Download the Flink Hadoop2 package: Flink with Hadoop 2
2. Make sure your HADOOP_HOME (or YARN_CONF_DIR or
HADOOP_CONF_DIR) environment variable is set to read your
YARN and HDFS configuration.
3. Run the YARN client with: ./bin/yarn-session.sh. You can run the client with options -n 10 -tm 8192 to allocate 10 TaskManagers with 8 GB of memory each.
For more detailed instructions, please check out the
documentation:
https://ci.apache.org/projects/flink/flink-docs-master/setu
yarn_setup.html
22
1.6 Flink on the Cloud
1.6.1 Google Compute Engine (GCE)
1.6.2 Amazon EMR
23
1.6 Cloud
1.6.1 Google Compute Engine
Free trial for Google Compute Engine:
https://cloud.google.com/free-trial/
Enjoy your $300 in GCE for 60 days!
Now, how to setup Flink with Hadoop 1 or
Hadoop 2 on top of a Google Compute Engine
cluster?
Google’s bdutil starts a cluster and deploys Flink with Hadoop. To get started, just follow the steps here:
https://ci.apache.org/projects/flink/flink-docs-master/se
24
1.6 Cloud
1.6.2 Amazon EMR
Amazon Elastic MapReduce (Amazon EMR) is a web
service providing a managed Hadoop framework.
• http://aws.amazon.com/elasticmapreduce/
• http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-what-is
emr.html
Example: Use Stratosphere with Amazon Elastic
MapReduce, February 18, 2014 by Robert Metzger
https://flink.apache.org/news/2014/02/18/amazon-elastic-mapreduce-cloud-yarn.html
Use a pre-defined cluster definition to deploy Apache Flink using the Karamel web app http://www.karamel.io/
Getting Started – Installing Apache Flink on Amazon
EC2 by Kamel Hakimzadeh. Published on October 12,
2015 https://www.youtube.com/watch?v=tCIA8_2dR14
25
Agenda
1. How to set up and configure your Apache Flink environment?
2. How to use Apache Flink tools?
3. How to run the examples in the Apache
Flink bundle?
4. How to set up your IDE (IntelliJ IDEA or
Eclipse) for Apache Flink?
5. How to write your Apache Flink program
in an IDE?
26
2. How to use Apache Flink tools?
2.1 Command-Line Interface (CLI)
2.2 Web Submission Client
2.3 Job Manager Web Interface
2.4 Interactive Scala Shell
2.5 Apache Zeppelin Notebook
27
2.1 Command-Line Interface (CLI)
 Flink provides a CLI to run programs that are packaged as JAR files, and to control their execution.
 bin/flink has 4 major actions:
• run  # runs a program
• info  # displays information about a program
• list  # lists scheduled and running jobs
• cancel  # cancels a running job
 Example: ./bin/flink info ./examples/KMeans.jar
 See CLI usage and related examples:
https://ci.apache.org/projects/flink/flink-docs-master/apis/cli.html
28
2.2 Web Submission Client
29
2.2 Web Submission Client
Flink provides a web interface to:
• Upload programs
• Execute programs
• Inspect their execution plans
• Showcase programs
• Debug execution plans
• Demonstrate the system as a whole
The web interface runs on port 8080 by default. To specify a custom port, set the webclient.port property in the conf/flink-conf.yaml configuration file.
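For example, to move the web client to another port, the entry in the configuration file would use the webclient.port key named above (the port value 8090 is just an illustration):

```
webclient.port: 8090
```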
30
2.2 Web Submission Client
Start the web interface by executing:
./bin/start-webclient.sh
Stop the web interface by executing:
./bin/stop-webclient.sh
• Jobs are submitted to the JobManager specified by jobmanager.rpc.address and jobmanager.rpc.port
• For more details and further configuration
options, please consult this webpage:
https://ci.apache.org/projects/flink/flink-docs-release-0.10/setup/config.html#
webclient
31
2.3 Job Manager Web Interface
 The JobManager (the master of the
distributed system) starts a web interface
to observe program execution.
 It runs on port 8081 by default (configured in conf/flink-conf.yaml).
 Open the JobManager’s web interface
at
http://localhost:8081
• jobmanager.rpc.port 6123
• jobmanager.web.port 8081
32
2.3 Job Manager Web Interface
Overall system status
Job execution details
Task Manager resource
utilization
33
2.3 Job Manager Web Interface
The JobManager web frontend allows you to:
• track the progress of a Flink program, as all status changes are also logged to the JobManager’s log file
• figure out why a program failed, as it displays the exceptions of failed tasks and helps identify which parallel task failed first and caused the other tasks to cancel execution
34
2.4   Interactive Scala Shell
./bin/start-scala-shell.sh local
35
2.4   Interactive Scala Shell
 Flink comes with an interactive Scala shell, a REPL (Read-Evaluate-Print Loop).
 It can be used in a local setup as well as in a cluster setup:
./bin/start-scala-shell.sh local
bin/start-scala-shell.sh remote <hostname> <portnumber>
• Interactive queries
• Lets you explore data quickly
• Complete Scala API available
 The Flink shell comes with command history and auto-completion.
 So far only batch mode is supported; there are plans to add streaming support in the future.
https://ci.apache.org/projects/flink/flink-docs-release-0.10/apis/scala_shell.html
36
2.4   Interactive Scala Shell
Usage:
• Example 1:
Scala-Flink> val input = env.fromElements(1, 2, 3, 4)
Scala-Flink> val doubleInput = input.map(_ * 2)
Scala-Flink> doubleInput.print()
• Example 2:
Scala-Flink> val text = env.fromElements("To be, or not to be")
Scala-Flink> val counts = text.flatMap { _.toLowerCase.split("\\W+") }.
map { (_, 1) }.
groupBy(0).
sum(1)
Scala-Flink> counts.print()
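Under the hood, Example 2 is the classic word count: tokenize, lower-case, group by word, sum. The same logic can be sketched in plain Java without Flink (the class name WordCountSketch is ours, for illustration only):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // Count word occurrences the way the shell example does:
    // lower-case, split on non-word characters, group by word, count.
    static Map<String, Long> count(String text) {
        return Arrays.stream(text.toLowerCase().split("\\W+"))
                     .filter(w -> !w.isEmpty())
                     .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(count("To be, or not to be"));
    }
}
```

The Scala shell version above expresses the same pipeline on a distributed DataSet instead of a local collection.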
37
2.4   Interactive Scala Shell
Problems with the interactive Scala shell:
• No visualization
• No saving
• No replaying of written code
• No assistance as in an IDE
38
2.5   Apache Zeppelin Notebook
39
2.5   Apache Zeppelin Notebook
• Web-based interactive computation environment
• Combines rich text, executable code, plots and rich media
• Exploratory data science
• Storytelling
40
2.5   Apache Zeppelin Notebook
Resources:
• Step-by-step tutorial for setting up Apache Zeppelin pointing to a full-blown cluster. Trevor Grant, November 3, 2015
http://trevorgrant.org/2015/11/03/apache-casserole-a-delicious-big-data-recipe-for-the-wh/
• ‘Data Science Lifecycle with Apache Flink and Apache Zeppelin’. Moon Soo Lee, Flink Forward 2015, Berlin, Germany. October 12, 2015
Slides: http
://www.slideshare.net/FlinkForward/moon-soo-lee-data-science-lifecy
zeppelin
Video recording: https://www.youtube.com/watch?v=icyOTyteMqs
• Interactive Data Analysis with Apache Flink. Till Rohrmann, June 24, 2015
41
Agenda
1. How to set up and configure your Apache Flink environment?
2. How to use Apache Flink tools?
3. How to run the examples in the Apache Flink bundle?
4. How to set up your IDE (IntelliJ IDEA or Eclipse) for Apache Flink?
5. How to write your Apache Flink program in an IDE?
42
3. How to run the examples in the Apache Flink bundle?
3.1.1 Where are the examples?
3.1.2 Where is the related source code?
3.1.3 How to re-build these examples?
3.1.4 How to run these examples?
43
3.1 How to run the examples in the Apache 
Flink bundle?
3.1.1    Where are the examples?
44
3.1 How to run the examples in the Apache 
Flink bundle?
• The examples provided in the Flink bundle showcase different applications of Flink, from simple word counting to graph algorithms.
• They illustrate the use of Flink’s API.
• They are a very good way to learn how to write Flink jobs.
• A good starting point would be to modify them!
Now, where is the related source code?
45
3.1 How to run the examples in the Apache 
Flink bundle?
3.1.2 Where is the related source code?
• You can find the source code of these Flink examples in the flink-examples and flink-streaming-examples modules of the source release of Flink. Example: https://www.apache.org/dist/flink/flink-0.10.0-src.tgz
• You can also access the source (and hence the examples) through GitHub:
https://github.com/apache/flink/tree/master/flink-examples
46
3.1 How to run the examples in the Apache 
Flink bundle?
3.1.2 Where is the related source code?
If you don't want to import the whole Flink project just to play around with the examples, you can:
• Create an empty Maven project. This script will automatically set everything up for you: $ curl http://flink.apache.org/q/quickstart.sh | bash
• Import the "quickstart" project into Eclipse or IntelliJ. It will download all dependencies and package everything correctly.
• If you want to use an example there, just copy the Java file into the "quickstart" project.
47
3.1 How to run the examples in the Apache 
Flink bundle?
3.1.3 How to re-build these examples?
To re-build the examples, run:
mvn clean package -DskipTests
in the "flink-examples/flink-java-examples" directory.
48
3.1 How to run the examples in the Apache 
Flink bundle?
3.1.4 How to run these examples?
How to display the command-line arguments?
./bin/flink info ./examples/WordCount.jar
Example of running an example:
./bin/flink run ./examples/KMeans.jar
Source code: 
https://github.com/apache/flink/blob/master/flink-examples/flink-java-example
KMeans.java
More on the bundled examples: 
https://ci.apache.org/projects/flink/flink-docs-master/apis/examples.html#running-an-
example
49
Run K-Means example
1. Generate input data
Flink contains a data generator for K-Means that has the following arguments (arguments in [] are optional):
-points <num> -k <num clusters> [-output <output-path>] [-stddev <relative stddev>] [-range <centroid range>] [-seed <seed>]
Go to the Flink root installation:
$ cd flink-0.10.0
Create a new directory that will contain the data:
$ mkdir kmeans
$ cd kmeans
50
Run K-Means example
Create some data using Flink's tool:
java -cp ../examples/KMeans.jar:../lib/flink-dist-0.10.0.jar org.apache.flink.examples.java.clustering.util.KMeansDataGenerator -points 500 -k 10 -stddev 0.08 -output `pwd`
The directory should now contain the files "centers" and "points".
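To get a feel for what the flags mean, here is a plain-Java sketch of the generator's idea: scatter -points samples around -k random centroids with the given -stddev (the class PointGenSketch and its details are illustrative, not the actual KMeansDataGenerator code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class PointGenSketch {
    // Generate n 2-D points scattered around k random centroids,
    // mimicking the -points, -k, -stddev, -seed flags.
    static List<double[]> generate(int n, int k, double stddev, long seed) {
        Random rnd = new Random(seed);
        double[][] centers = new double[k][2];
        for (int i = 0; i < k; i++) {
            centers[i][0] = rnd.nextDouble();
            centers[i][1] = rnd.nextDouble();
        }
        List<double[]> points = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            double[] c = centers[i % k];  // cycle through the centroids
            points.add(new double[] {
                c[0] + rnd.nextGaussian() * stddev,
                c[1] + rnd.nextGaussian() * stddev });
        }
        return points;
    }

    public static void main(String[] args) {
        System.out.println(generate(500, 10, 0.08, 42L).size());
    }
}
```

A fixed seed makes the generated data reproducible, which is what the -seed flag is for.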
51
Run K-Means example
Continue following the instructions on Quick
Start: Run K-Means Example as outlined here:
https://ci.apache.org/projects/flink/flink-docs-releas
Happy Flinking!
52
Agenda
1. How to set up and configure your Apache Flink environment?
2. How to use Apache Flink tools?
3. How to run the examples in the Apache
Flink bundle?
4. How to set up your IDE (IntelliJ IDEA or
Eclipse) for Apache Flink?
5. How to write your Apache Flink program
in an IDE?
53
4. How to set up your IDE (IntelliJ IDEA or
Eclipse) for Apache Flink?
4.1 How to set up your IDE (IntelliJ IDEA)?
4.2 How to set up your IDE (Eclipse)?
Flink uses mixed Scala/Java projects, which
pose a challenge to some IDEs
Minimal requirements for an IDE are:
• Support for Java and Scala (also mixed projects)
• Support for Maven with Java and Scala
54
4.1 How to set up your IDE (IntelliJ IDEA)?
IntelliJ IDEA supports Maven out of the box and offers a plugin for Scala development.
• IntelliJ IDEA download: https://www.jetbrains.com/idea/download/
• IntelliJ Scala plugin: http://plugins.jetbrains.com/plugin/?id=1347
• Check out the Setting up IntelliJ IDEA guide for details:
https://github.com/apache/flink/blob/master/docs/internals/ide_setup.md#intel
• Screencast: Run Apache Flink WordCount from IntelliJ:
https://www.youtube.com/watch?v=JIV_rX-OIQM
55
4.2 How to set up your IDE (Eclipse)?
• For Eclipse users, Apache Flink committers
recommend using Scala IDE 3.0.3, based on
Eclipse Kepler.
• While this is a slightly older version, they found it to be the version that works most robustly for a complex project like Flink. One restriction, though, is that it works only with Java 7, not with Java 8.
• Check out how to setup Eclipse docs:
https://github.com/apache/flink/blob/master/docs/internals/ide_setup.md
#eclipse
56
Agenda
1. How to set up and configure your Apache Flink environment?
2. How to use Apache Flink tools?
3. How to run the examples in the Apache
Flink bundle?
4. How to set up your IDE (IntelliJ IDEA or
Eclipse) for Apache Flink?
5. How to write your Apache Flink program
in an IDE?
57
5. How to write your Apache
Flink program in an IDE?
5.1 How to write a Flink program in an IDE?
5.2 How to generate a Flink project with
Maven?
5.3 How to import the Flink Maven project into an IDE
5.4 How to use logging?
5.5 FAQs and best practices related to coding
58
5.1 How to write a Flink program in an
IDE?
The easiest way to get a working setup to
develop (and locally execute) Flink programs
is to follow the Quick Start guide:
https://ci.apache.org/projects/flink/flink-docs-master/quickstart/java_api_quickstart.html
https://ci.apache.org/projects/flink/flink-docs-master/quickstart/scala_api_quickstart.html
It uses a Maven archetype to configure and generate a Flink Maven project.
This will save you time dealing with transitive
dependencies!
This Maven project can be imported into your
IDE.
59
5.2. How to generate a skeleton Flink
project with Maven?
Generate a skeleton flink-quickstart-java Maven project to get started, with no need to manually download any .tgz or .jar files.
Option 1: $ curl http://flink.apache.org/q/quickstart.sh |
bash
A sample quickstart Flink Job will be created:
• Switch into the directory using: cd quickstart
• Import the project there using your favorite IDE
(Import it as a maven project)
• Build a jar inside the directory using: mvn clean
package
• You will find the runnable jar in quickstart/target
60
5.2. How to generate a skeleton Flink
project with Maven?
Option 2: Type the command below to create a flink-quickstart-java or flink-quickstart-scala project, and specify the Flink version:
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.flink \
  -DarchetypeArtifactId=flink-quickstart-java \
  -DarchetypeVersion=0.10.0
(You can also put "flink-quickstart-scala" as the archetype artifact id, or "0.1.0-SNAPSHOT" as the version.)
61
5.2. How to generate a skeleton Flink
project with Maven?
The generated projects are located in a folder
called flink-java-project or flink-scala-project.
In order to test the generated projects and to download
all required dependencies run the following commands
(change flink-java-project to flink-scala-project for
Scala projects)
• cd flink-java-project
• mvn clean package
Maven will now start to download all required
dependencies and build the Flink quickstart project.
62
5.3 How to import the Flink Maven project into an IDE
The generated Maven project needs to be imported into
your IDE:
IntelliJ:
• Select “File” -> “Import Project”
• Select root folder of your project
• Select “Import project from external model”,
select “Maven”
• Leave default options and finish the import
Eclipse:
• Select “File” -> “Import” -> “Maven” -> “Existing Maven
Project”
• Follow the import instructions
63
5.4 How to use logging?
The logging in Flink is implemented using the slf4j logging interface, with log4j as the underlying logging framework.
Log4j is controlled using a property file, usually called log4j.properties. You can pass the filename and location of this file to the JVM using the -Dlog4j.configuration= parameter.
Loggers using slf4j are created by calling:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
Logger LOG = LoggerFactory.getLogger(Foobar.class);
You can also use logback instead of log4j.
https://ci.apache.org/projects/flink/flink-docs-release-0.9/internals/logging.html
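For illustration, a minimal log4j.properties in the standard log4j 1.x format might look like this (a sketch, not the exact file shipped with Flink):

```
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{HH:mm:ss,SSS} %-5p %c{1} - %m%n
```

Point the JVM at it with -Dlog4j.configuration= as described above, or drop it on the classpath of your quickstart project.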
64
5.5 FAQs & best practices related to
coding
Errors: http://flink.apache.org/faq.html#errors
Usage: http://flink.apache.org/faq.html#usage
Flink APIs best practices:
https://ci.apache.org/projects/flink/flink-docs-master/apis/best_practices.html
Thanks!
Using Docker to build and test in your laptop and JenkinsMicael Gallego
 
codemotion-docker-2014
codemotion-docker-2014codemotion-docker-2014
codemotion-docker-2014Carlo Bonamico
 
Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...
Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...
Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...linuxlab_conf
 
Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...
Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...
Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...Marco Cavallini
 
Why everyone is excited about Docker (and you should too...) - Carlo Bonamic...
Why everyone is excited about Docker (and you should too...) -  Carlo Bonamic...Why everyone is excited about Docker (and you should too...) -  Carlo Bonamic...
Why everyone is excited about Docker (and you should too...) - Carlo Bonamic...Codemotion
 
Dockerizing a Symfony2 application
Dockerizing a Symfony2 applicationDockerizing a Symfony2 application
Dockerizing a Symfony2 applicationRoman Rodomansky
 
當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)
當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)
當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)Ruoshi Ling
 
Server(less) Swift at SwiftCloudWorkshop 3
Server(less) Swift at SwiftCloudWorkshop 3Server(less) Swift at SwiftCloudWorkshop 3
Server(less) Swift at SwiftCloudWorkshop 3kognate
 
Dockerizing Symfony Applications - Symfony Live Berlin 2014
Dockerizing Symfony Applications - Symfony Live Berlin 2014Dockerizing Symfony Applications - Symfony Live Berlin 2014
Dockerizing Symfony Applications - Symfony Live Berlin 2014D
 
Agile Brown Bag - Vagrant & Docker: Introduction
Agile Brown Bag - Vagrant & Docker: IntroductionAgile Brown Bag - Vagrant & Docker: Introduction
Agile Brown Bag - Vagrant & Docker: IntroductionAgile Partner S.A.
 
Preparation study of_docker - (MOSG)
Preparation study of_docker  - (MOSG)Preparation study of_docker  - (MOSG)
Preparation study of_docker - (MOSG)Soshi Nemoto
 
IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...
IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...
IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...Eric Smalling
 
Node.js, Vagrant, Chef, and Mathoid @ Benetech
Node.js, Vagrant, Chef, and Mathoid @ BenetechNode.js, Vagrant, Chef, and Mathoid @ Benetech
Node.js, Vagrant, Chef, and Mathoid @ BenetechChristopher Bumgardner
 
Jfrog artifactory as private docker registry
Jfrog artifactory as private docker registryJfrog artifactory as private docker registry
Jfrog artifactory as private docker registryVipin Mandale
 

Similaire à Step-by-Step Introduction to Apache Flink (20)

Apache flink-crash-course-by-slim-baltagi-and-srini-palthepu-150817191850-lva...
Apache flink-crash-course-by-slim-baltagi-and-srini-palthepu-150817191850-lva...Apache flink-crash-course-by-slim-baltagi-and-srini-palthepu-150817191850-lva...
Apache flink-crash-course-by-slim-baltagi-and-srini-palthepu-150817191850-lva...
 
Extracting twitter data using apache flume
Extracting twitter data using apache flumeExtracting twitter data using apache flume
Extracting twitter data using apache flume
 
Phalcon - Giant Killer
Phalcon - Giant KillerPhalcon - Giant Killer
Phalcon - Giant Killer
 
Keep calm and vagrant up
Keep calm and vagrant upKeep calm and vagrant up
Keep calm and vagrant up
 
BLCN532 Lab 1Set up your development environmentV2.0.docx
BLCN532 Lab 1Set up your development environmentV2.0.docxBLCN532 Lab 1Set up your development environmentV2.0.docx
BLCN532 Lab 1Set up your development environmentV2.0.docx
 
Using Docker to build and test in your laptop and Jenkins
Using Docker to build and test in your laptop and JenkinsUsing Docker to build and test in your laptop and Jenkins
Using Docker to build and test in your laptop and Jenkins
 
codemotion-docker-2014
codemotion-docker-2014codemotion-docker-2014
codemotion-docker-2014
 
Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...
Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...
Marco Cavallini - Yocto Project, an automatic generator of embedded Linux dis...
 
Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...
Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...
Marco Cavallini @ LinuxLab 2018 : Workshop Yocto Project, an automatic genera...
 
Why everyone is excited about Docker (and you should too...) - Carlo Bonamic...
Why everyone is excited about Docker (and you should too...) -  Carlo Bonamic...Why everyone is excited about Docker (and you should too...) -  Carlo Bonamic...
Why everyone is excited about Docker (and you should too...) - Carlo Bonamic...
 
Dockerizing a Symfony2 application
Dockerizing a Symfony2 applicationDockerizing a Symfony2 application
Dockerizing a Symfony2 application
 
當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)
當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)
當專案漸趕,當遷移也不再那麼難 (Ship Your Projects with Docker EcoSystem)
 
Server(less) Swift at SwiftCloudWorkshop 3
Server(less) Swift at SwiftCloudWorkshop 3Server(less) Swift at SwiftCloudWorkshop 3
Server(less) Swift at SwiftCloudWorkshop 3
 
Dockerizing Symfony Applications - Symfony Live Berlin 2014
Dockerizing Symfony Applications - Symfony Live Berlin 2014Dockerizing Symfony Applications - Symfony Live Berlin 2014
Dockerizing Symfony Applications - Symfony Live Berlin 2014
 
Agile Brown Bag - Vagrant & Docker: Introduction
Agile Brown Bag - Vagrant & Docker: IntroductionAgile Brown Bag - Vagrant & Docker: Introduction
Agile Brown Bag - Vagrant & Docker: Introduction
 
PHP Conference - Phalcon hands-on
PHP Conference - Phalcon hands-onPHP Conference - Phalcon hands-on
PHP Conference - Phalcon hands-on
 
Preparation study of_docker - (MOSG)
Preparation study of_docker  - (MOSG)Preparation study of_docker  - (MOSG)
Preparation study of_docker - (MOSG)
 
IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...
IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...
IBM Index 2018 Conference Workshop: Modernizing Traditional Java App's with D...
 
Node.js, Vagrant, Chef, and Mathoid @ Benetech
Node.js, Vagrant, Chef, and Mathoid @ BenetechNode.js, Vagrant, Chef, and Mathoid @ Benetech
Node.js, Vagrant, Chef, and Mathoid @ Benetech
 
Jfrog artifactory as private docker registry
Jfrog artifactory as private docker registryJfrog artifactory as private docker registry
Jfrog artifactory as private docker registry
 

Plus de Slim Baltagi

How to select a modern data warehouse and get the most out of it?
How to select a modern data warehouse and get the most out of it?How to select a modern data warehouse and get the most out of it?
How to select a modern data warehouse and get the most out of it?Slim Baltagi
 
Modern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-Baltagi
Modern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-BaltagiModern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-Baltagi
Modern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-BaltagiSlim Baltagi
 
Modern big data and machine learning in the era of cloud, docker and kubernetes
Modern big data and machine learning in the era of cloud, docker and kubernetesModern big data and machine learning in the era of cloud, docker and kubernetes
Modern big data and machine learning in the era of cloud, docker and kubernetesSlim Baltagi
 
Building Streaming Data Applications Using Apache Kafka
Building Streaming Data Applications Using Apache KafkaBuilding Streaming Data Applications Using Apache Kafka
Building Streaming Data Applications Using Apache KafkaSlim Baltagi
 
Kafka Streams for Java enthusiasts
Kafka Streams for Java enthusiastsKafka Streams for Java enthusiasts
Kafka Streams for Java enthusiastsSlim Baltagi
 
Analysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summit
Analysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summitAnalysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summit
Analysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summitSlim Baltagi
 
Apache Fink 1.0: A New Era for Real-World Streaming Analytics
Apache Fink 1.0: A New Era  for Real-World Streaming AnalyticsApache Fink 1.0: A New Era  for Real-World Streaming Analytics
Apache Fink 1.0: A New Era for Real-World Streaming AnalyticsSlim Baltagi
 
Overview of Apache Fink: The 4G of Big Data Analytics Frameworks
Overview of Apache Fink: The 4G of Big Data Analytics FrameworksOverview of Apache Fink: The 4G of Big Data Analytics Frameworks
Overview of Apache Fink: The 4G of Big Data Analytics FrameworksSlim Baltagi
 
Apache Flink: Real-World Use Cases for Streaming Analytics
Apache Flink: Real-World Use Cases for Streaming AnalyticsApache Flink: Real-World Use Cases for Streaming Analytics
Apache Flink: Real-World Use Cases for Streaming AnalyticsSlim Baltagi
 
Apache Flink community Update for March 2016 - Slim Baltagi
Apache Flink community Update for March 2016 - Slim BaltagiApache Flink community Update for March 2016 - Slim Baltagi
Apache Flink community Update for March 2016 - Slim BaltagiSlim Baltagi
 
Unified Batch and Real-Time Stream Processing Using Apache Flink
Unified Batch and Real-Time Stream Processing Using Apache FlinkUnified Batch and Real-Time Stream Processing Using Apache Flink
Unified Batch and Real-Time Stream Processing Using Apache FlinkSlim Baltagi
 
Why apache Flink is the 4G of Big Data Analytics Frameworks
Why apache Flink is the 4G of Big Data Analytics FrameworksWhy apache Flink is the 4G of Big Data Analytics Frameworks
Why apache Flink is the 4G of Big Data Analytics FrameworksSlim Baltagi
 
Hadoop or Spark: is it an either-or proposition? By Slim Baltagi
Hadoop or Spark: is it an either-or proposition? By Slim BaltagiHadoop or Spark: is it an either-or proposition? By Slim Baltagi
Hadoop or Spark: is it an either-or proposition? By Slim BaltagiSlim Baltagi
 
Big Data at CME Group: Challenges and Opportunities
Big Data at CME Group: Challenges and Opportunities Big Data at CME Group: Challenges and Opportunities
Big Data at CME Group: Challenges and Opportunities Slim Baltagi
 
Building a Modern Data Architecture with Enterprise Hadoop
Building a Modern Data Architecture with Enterprise HadoopBuilding a Modern Data Architecture with Enterprise Hadoop
Building a Modern Data Architecture with Enterprise HadoopSlim Baltagi
 
Transitioning Compute Models: Hadoop MapReduce to Spark
Transitioning Compute Models: Hadoop MapReduce to SparkTransitioning Compute Models: Hadoop MapReduce to Spark
Transitioning Compute Models: Hadoop MapReduce to SparkSlim Baltagi
 
A Big Data Journey: Bringing Open Source to Finance
A Big Data Journey: Bringing Open Source to FinanceA Big Data Journey: Bringing Open Source to Finance
A Big Data Journey: Bringing Open Source to FinanceSlim Baltagi
 

Plus de Slim Baltagi (17)

How to select a modern data warehouse and get the most out of it?
How to select a modern data warehouse and get the most out of it?How to select a modern data warehouse and get the most out of it?
How to select a modern data warehouse and get the most out of it?
 
Modern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-Baltagi
Modern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-BaltagiModern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-Baltagi
Modern-Data-Warehouses-In-The-Cloud-Use-Cases-Slim-Baltagi
 
Modern big data and machine learning in the era of cloud, docker and kubernetes
Modern big data and machine learning in the era of cloud, docker and kubernetesModern big data and machine learning in the era of cloud, docker and kubernetes
Modern big data and machine learning in the era of cloud, docker and kubernetes
 
Building Streaming Data Applications Using Apache Kafka
Building Streaming Data Applications Using Apache KafkaBuilding Streaming Data Applications Using Apache Kafka
Building Streaming Data Applications Using Apache Kafka
 
Kafka Streams for Java enthusiasts
Kafka Streams for Java enthusiastsKafka Streams for Java enthusiasts
Kafka Streams for Java enthusiasts
 
Analysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summit
Analysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summitAnalysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summit
Analysis-of-Major-Trends-in-big-data-analytics-slim-baltagi-hadoop-summit
 
Apache Fink 1.0: A New Era for Real-World Streaming Analytics
Apache Fink 1.0: A New Era  for Real-World Streaming AnalyticsApache Fink 1.0: A New Era  for Real-World Streaming Analytics
Apache Fink 1.0: A New Era for Real-World Streaming Analytics
 
Overview of Apache Fink: The 4G of Big Data Analytics Frameworks
Overview of Apache Fink: The 4G of Big Data Analytics FrameworksOverview of Apache Fink: The 4G of Big Data Analytics Frameworks
Overview of Apache Fink: The 4G of Big Data Analytics Frameworks
 
Apache Flink: Real-World Use Cases for Streaming Analytics
Apache Flink: Real-World Use Cases for Streaming AnalyticsApache Flink: Real-World Use Cases for Streaming Analytics
Apache Flink: Real-World Use Cases for Streaming Analytics
 
Apache Flink community Update for March 2016 - Slim Baltagi
Apache Flink community Update for March 2016 - Slim BaltagiApache Flink community Update for March 2016 - Slim Baltagi
Apache Flink community Update for March 2016 - Slim Baltagi
 
Unified Batch and Real-Time Stream Processing Using Apache Flink
Unified Batch and Real-Time Stream Processing Using Apache FlinkUnified Batch and Real-Time Stream Processing Using Apache Flink
Unified Batch and Real-Time Stream Processing Using Apache Flink
 
Why apache Flink is the 4G of Big Data Analytics Frameworks
Why apache Flink is the 4G of Big Data Analytics FrameworksWhy apache Flink is the 4G of Big Data Analytics Frameworks
Why apache Flink is the 4G of Big Data Analytics Frameworks
 
Hadoop or Spark: is it an either-or proposition? By Slim Baltagi
Hadoop or Spark: is it an either-or proposition? By Slim BaltagiHadoop or Spark: is it an either-or proposition? By Slim Baltagi
Hadoop or Spark: is it an either-or proposition? By Slim Baltagi
 
Big Data at CME Group: Challenges and Opportunities
Big Data at CME Group: Challenges and Opportunities Big Data at CME Group: Challenges and Opportunities
Big Data at CME Group: Challenges and Opportunities
 
Building a Modern Data Architecture with Enterprise Hadoop
Building a Modern Data Architecture with Enterprise HadoopBuilding a Modern Data Architecture with Enterprise Hadoop
Building a Modern Data Architecture with Enterprise Hadoop
 
Transitioning Compute Models: Hadoop MapReduce to Spark
Transitioning Compute Models: Hadoop MapReduce to SparkTransitioning Compute Models: Hadoop MapReduce to Spark
Transitioning Compute Models: Hadoop MapReduce to Spark
 
A Big Data Journey: Bringing Open Source to Finance
A Big Data Journey: Bringing Open Source to FinanceA Big Data Journey: Bringing Open Source to Finance
A Big Data Journey: Bringing Open Source to Finance
 

Dernier

RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.natarajan8993
 
IMA MSN - Medical Students Network (2).pptx
IMA MSN - Medical Students Network (2).pptxIMA MSN - Medical Students Network (2).pptx
IMA MSN - Medical Students Network (2).pptxdolaknnilon
 
Generative AI for Social Good at Open Data Science East 2024
Generative AI for Social Good at Open Data Science East 2024Generative AI for Social Good at Open Data Science East 2024
Generative AI for Social Good at Open Data Science East 2024Colleen Farrelly
 
Defining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryDefining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryJeremy Anderson
 
How we prevented account sharing with MFA
How we prevented account sharing with MFAHow we prevented account sharing with MFA
How we prevented account sharing with MFAAndrei Kaleshka
 
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhThiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhYasamin16
 
科罗拉多大学波尔得分校毕业证学位证成绩单-可办理
科罗拉多大学波尔得分校毕业证学位证成绩单-可办理科罗拉多大学波尔得分校毕业证学位证成绩单-可办理
科罗拉多大学波尔得分校毕业证学位证成绩单-可办理e4aez8ss
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 217djon017
 
Advanced Machine Learning for Business Professionals
Advanced Machine Learning for Business ProfessionalsAdvanced Machine Learning for Business Professionals
Advanced Machine Learning for Business ProfessionalsVICTOR MAESTRE RAMIREZ
 
RS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝Delhi
RS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝DelhiRS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝Delhi
RS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝Delhijennyeacort
 
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...Thomas Poetter
 
办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一
办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一
办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一F sss
 
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degreeyuu sss
 
Identifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanIdentifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanMYRABACSAFRA2
 
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxmodul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxaleedritatuxx
 
Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Seán Kennedy
 
原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档
原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档
原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档208367051
 
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort servicejennyeacort
 
Student Profile Sample report on improving academic performance by uniting gr...
Student Profile Sample report on improving academic performance by uniting gr...Student Profile Sample report on improving academic performance by uniting gr...
Student Profile Sample report on improving academic performance by uniting gr...Seán Kennedy
 
办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一
办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一
办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一F La
 

Dernier (20)

RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.RABBIT: A CLI tool for identifying bots based on their GitHub events.
RABBIT: A CLI tool for identifying bots based on their GitHub events.
 
IMA MSN - Medical Students Network (2).pptx
IMA MSN - Medical Students Network (2).pptxIMA MSN - Medical Students Network (2).pptx
IMA MSN - Medical Students Network (2).pptx
 
Generative AI for Social Good at Open Data Science East 2024
Generative AI for Social Good at Open Data Science East 2024Generative AI for Social Good at Open Data Science East 2024
Generative AI for Social Good at Open Data Science East 2024
 
Defining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data StoryDefining Constituents, Data Vizzes and Telling a Data Story
Defining Constituents, Data Vizzes and Telling a Data Story
 
How we prevented account sharing with MFA
How we prevented account sharing with MFAHow we prevented account sharing with MFA
How we prevented account sharing with MFA
 
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhhThiophen Mechanism khhjjjjjjjhhhhhhhhhhh
Thiophen Mechanism khhjjjjjjjhhhhhhhhhhh
 
科罗拉多大学波尔得分校毕业证学位证成绩单-可办理
科罗拉多大学波尔得分校毕业证学位证成绩单-可办理科罗拉多大学波尔得分校毕业证学位证成绩单-可办理
科罗拉多大学波尔得分校毕业证学位证成绩单-可办理
 
Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2Easter Eggs From Star Wars and in cars 1 and 2
Easter Eggs From Star Wars and in cars 1 and 2
 
Advanced Machine Learning for Business Professionals
Advanced Machine Learning for Business ProfessionalsAdvanced Machine Learning for Business Professionals
Advanced Machine Learning for Business Professionals
 
RS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝Delhi
RS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝DelhiRS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝Delhi
RS 9000 Call In girls Dwarka Mor (DELHI)⇛9711147426🔝Delhi
 
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
Minimizing AI Hallucinations/Confabulations and the Path towards AGI with Exa...
 
办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一
办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一
办理学位证加利福尼亚大学洛杉矶分校毕业证,UCLA成绩单原版一比一
 
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
毕业文凭制作#回国入职#diploma#degree澳洲中央昆士兰大学毕业证成绩单pdf电子版制作修改#毕业文凭制作#回国入职#diploma#degree
 
Identifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population MeanIdentifying Appropriate Test Statistics Involving Population Mean
Identifying Appropriate Test Statistics Involving Population Mean
 
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptxmodul pembelajaran robotic Workshop _ by Slidesgo.pptx
modul pembelajaran robotic Workshop _ by Slidesgo.pptx
 
Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...Student profile product demonstration on grades, ability, well-being and mind...
Student profile product demonstration on grades, ability, well-being and mind...
 
原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档
原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档
原版1:1定制南十字星大学毕业证(SCU毕业证)#文凭成绩单#真实留信学历认证永久存档
 
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
 
Student Profile Sample report on improving academic performance by uniting gr...
Student Profile Sample report on improving academic performance by uniting gr...Student Profile Sample report on improving academic performance by uniting gr...
Student Profile Sample report on improving academic performance by uniting gr...
 
办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一
办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一
办理(Vancouver毕业证书)加拿大温哥华岛大学毕业证成绩单原版一比一
 

Step-by-Step Introduction to Apache Flink

Editor's notes

  1. The following steps assume a UNIX-like environment. For Windows, see Flink on Windows: https://ci.apache.org/projects/flink/flink-docs-master/setup/local_setup.html#flink-on-windows
  2. For Windows, see Flink on Windows: https://ci.apache.org/projects/flink/flink-docs-master/setup/local_setup.html#flink-on-windows
  3. This is Slide 5 of http://www.slideshare.net/robertmetzger1/apache-flink-hands-on
  4. We pass the filename and location of this file using the -Dlog4j.configuration= parameter to the JVM.
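The note above on the `-Dlog4j.configuration=` flag can be sketched as a shell snippet. The paths below are hypothetical examples, not taken from the slides; in practice Flink's own startup scripts assemble this flag from the `conf/` directory automatically, so you would only build it by hand for a manual JVM launch.

```shell
# Hypothetical install locations; adjust to your own setup.
FLINK_CONF_DIR="/opt/flink/conf"
LOG4J_PROPS="${FLINK_CONF_DIR}/log4j.properties"

# The flag passed to the JVM so log4j can locate its configuration file:
JVM_ARGS="-Dlog4j.configuration=file:${LOG4J_PROPS}"

echo "${JVM_ARGS}"
# A manual launch would then look like (not executed here):
#   java ${JVM_ARGS} -cp <flink-dist-jar> <main-class>
```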