Spark Streaming | Fresco Play
Question 1: For every batch interval, the Driver launches tasks to process a block.
Answer: True
Question 2: Spark Streaming has two categories of sources - Basic sources and Advanced sources.
Answer: True
Question 3: Batch interval is configured at _____.
Answer: the creation of the Spark Streaming Context
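For reference, a minimal Scala sketch of this: the batch interval is supplied once, when the StreamingContext is constructed. The app name, master URL, and 10-second interval are illustrative values, not from the quiz.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// The batch interval is fixed at construction time and cannot be
// changed afterwards for this context.
val conf = new SparkConf().setAppName("StreamingDemo").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(10))   // 10-second micro-batches
```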
Question 4: What does saveAsTextFiles(prefix, [suffix]) do?
Answer: Save this DStream's contents as text files
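A hedged example of that output operation, reusing the `ssc` from the sketch above; the socket source and HDFS path are assumed examples. Each batch is written out as files named `<prefix>-<batch time>[.<suffix>]`.

```scala
// Assumes ssc from the earlier sketch; host, port, and path are examples only.
val lines = ssc.socketTextStream("localhost", 9999)
lines.saveAsTextFiles("hdfs:///streams/lines", "txt")
```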
Question 5: What is a Window Duration/Size?
Answer: Interval at which a certain fold operation is done on top of DStreams.
Question 6: Who is responsible for keeping track of the Block Ids?
Answer: Block Management Master in the Driver
Question 7: DStreams are immutable. Choose the right option.
Answer: Yes, like RDDs, DStreams are immutable
Question 8: Which among the following are Basic Sources of Spark Streaming?
Answer: File systems
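As a sketch of a basic (built-in) source, a DStream can monitor a file system directory for new files; the directory path below is an assumed example, and `ssc` is the context from the first sketch.

```scala
// textFileStream is a basic source: no extra linking dependency is required.
val fileStream = ssc.textFileStream("hdfs:///data/incoming")
fileStream.print()   // print the first elements of each batch
```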
Question 9: MLlib and Spark SQL can work on top of the data taken up via Spark Streaming.
Answer: True
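One common way to combine them, shown in the Spark Streaming programming guide, is to hand each micro-batch to Spark SQL inside foreachRDD. This sketch reuses `lines` from the socket-source example above; the view name and query are assumptions.

```scala
import org.apache.spark.sql.SparkSession

lines.foreachRDD { rdd =>
  // Reuse (or lazily create) a SparkSession for this batch's RDD.
  val spark = SparkSession.builder.config(rdd.sparkContext.getConf).getOrCreate()
  import spark.implicits._
  rdd.toDF("line").createOrReplaceTempView("lines")
  spark.sql("SELECT count(*) AS rows_in_batch FROM lines").show()
}
```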
Question 10: What is a Sliding Interval?
Answer: Interval at which sliding of the window area occurs.
Question 11: Which among the following is true about Window Operations?
Answer: All the options
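A minimal windowed-aggregation sketch tying the ideas above together: a 30-second window duration, a 10-second sliding interval, and a fold (reduceByKeyAndWindow) applied over the window. The word-count pipeline is illustrative and reuses `lines` and the imports from the earlier sketches.

```scala
val pairs = lines.flatMap(_.split(" ")).map(word => (word, 1))

// Window duration = 30 s, sliding interval = 10 s; both must be
// multiples of the batch interval of the StreamingContext.
val windowedCounts = pairs.reduceByKeyAndWindow(
  (a: Int, b: Int) => a + b, Seconds(30), Seconds(10))
windowedCounts.print()
```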
Question 12: Benefits of Discretized Stream Processing are ___________.
Answer: All the options
Question 13: We specify ___________ when we create streaming context.
Answer: batch interval
Question 14: DStreams are _______.
Answer: Collection of RDDs
Question 15: Which among the following can act as a data sink for Spark Streaming?
Answer: All the options
Question 16: What is the strategy taken in order to prevent loss of the incoming stream?
Answer: Data is replicated across different nodes
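A sketch of how this looks in practice: receiver-based sources accept a StorageLevel, and the replicated level below keeps a second copy of each received block on another node so a single executor failure does not lose unprocessed data. The host, port, and explicit level are assumptions; `ssc` is from the first sketch.

```scala
import org.apache.spark.storage.StorageLevel

// Explicitly request a replicated storage level for the received blocks.
val replicatedLines = ssc.socketTextStream(
  "localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER_2)
```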
Question 17: DStream represents a continuous stream of data.
Answer: True
Question 18: What is a batch interval?
Answer: Interval at which a DStream is created
Question 19: reduceByKey is a _________.
Answer: Transformation
Question 20: With Spark Streaming, the incoming data is split into micro batches.
Answer: True
Question 21: Which of the following transformations can be applied to a DStream?
Answer: All the options
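A short sketch of typical DStream transformations; every call returns a new, immutable DStream, and nothing runs until the context is started. The word-count pipeline is an assumed example building on `lines` from the earlier sketches.

```scala
val words    = lines.flatMap(_.split(" "))     // flatMap
val nonEmpty = words.filter(_.nonEmpty)        // filter
val tuples   = nonEmpty.map(word => (word, 1)) // map
val counts   = tuples.reduceByKey(_ + _)       // reduceByKey (a transformation)
counts.print()                                 // print is an output operation
```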
Question 22: What is the programming abstraction in Spark Streaming?
Answer: DStreams
Question 23: DStreams are internally _______.
Answer: Collection of RDDs
Question 24: Receiver receives data from the streaming sources at the start of _________.
Answer: Streaming Context
Question 25: Which among the following is true about Spark Streaming?
Answer: All the options
Question 26: Applying transformations on top of a DStream will yield _______.
Answer: A new DStream
Question 27: We cannot configure Twitter as a data source system for Spark Streaming.
Answer: False
Question 28: DStreams are internally a collection of _______.
Answer: RDDs
Question 29: HDFS cannot be a sink for Spark Streaming.
Answer: False
Question 30: Which among the following can act as a data source for Spark Streaming?
Answer: All the options
Question 31: DStreams cannot be created directly from sources such as Kafka and Flume.
Answer: False
Question 32: DStreams can be created from an existing DStream.
Answer: True
Question 33: How can a DStream be created?
Answer: None of the options
Question 34: Spark Streaming converts the input data streams into ______.
Answer: micro-batches
Question 35: Internally, a DStream is represented as a sequence of _____ arriving at discrete time intervals.
Answer: RDDs
Question 36: Choose the correct statement.
Answer: All the options
Question 37: The receiver divides the stream into blocks and keeps them in memory.
Answer: True
Question 38: ssc.start() is the entry point for a Streaming application.
Answer: True
Question 39: Starting point of a streaming application is _______.
Answer: ssc.start()
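The usual end of a Spark Streaming driver program, as a sketch: no data is received or processed before start() is called, and awaitTermination() keeps the application running.

```scala
ssc.start()             // receivers begin pulling data only from this point
ssc.awaitTermination()  // block the driver until the streaming job is stopped
```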
Question 40: Sliding Interval is the interval at which sliding of the window area occurs.
Answer: True
Question 41: There can be multiple DStreams in a single window.
Answer: True
Question 42: When is a batch interval defined?
Answer: At the creation of the Streaming Context
Question 43: Data sources for Spark Streaming that come under the 'Advanced sources' category include ________.
Answer: All the options
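Advanced sources such as Kafka, Flume, and Kinesis need an extra linking artifact (e.g. spark-streaming-kafka-0-10 for Kafka). Below is a hedged Kafka direct-stream sketch, with the broker address, group id, and topic name as assumed values and `ssc` taken from the first sketch.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

val kafkaParams = Map[String, Object](
  "bootstrap.servers"  -> "localhost:9092",
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "streaming-demo"
)

// Each record arrives as a ConsumerRecord[String, String] in the DStream.
val kafkaStream = KafkaUtils.createDirectStream[String, String](
  ssc, PreferConsistent, Subscribe[String, String](Seq("events"), kafkaParams))
kafkaStream.map(record => record.value).print()
```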
Question 44: Spark Streaming can be used for real-time processing of data.
Answer: True
Question 45: Block Management Master keeps track of ___.
Answer: Block Ids
Question 46: Block Management units in the worker nodes report to ____.
Answer: Block Management Master in the Driver
Question 47: Which among the following needs to be a multiple of batch interval?
Answer: All the options
Question 48: The basic programming abstraction of Spark Streaming is _______.
Answer: DStreams