D:\Java\bin\java -Didea.launcher.port=7537 "-Didea.launcher.bin.path=D:\IntelliJ IDEA Community Edition 15.0.4\bin" -classpath C:\Users\Administrator.PC-201512221019\AppData\Local\Temp\classpath18.jar -Dfile.encoding=UTF-8 com.intellij.rt.execution.application.AppMain zkjz.hjr.Streaming.StreamingKMeansExample D:\trainingDir D:\testDir 5 1 2
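The driver class `zkjz.hjr.Streaming.StreamingKMeansExample` is not shown in this log. Judging from its arguments (a training directory, a test directory, a 5-second batch duration matching the `Slide time = 5000 ms` lines below, k = 1, and 2 dimensions) and the `collect at StreamingKMeans.scala:89` job it triggers, it presumably resembles Spark MLlib's standard streaming k-means example. A minimal sketch, assuming that argument order; the object and variable names here are hypothetical:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.mllib.clustering.StreamingKMeans
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingKMeansExample {
  def main(args: Array[String]): Unit = {
    // Assumed usage: <trainingDir> <testDir> <batchDurationSec> <numClusters> <numDimensions>
    val conf = new SparkConf().setMaster("local[2]").setAppName("StreamingKMeansExample")
    val ssc = new StreamingContext(conf, Seconds(args(2).toLong))

    // Two file streams, matching the paired "Finding new files" lines per batch in the log.
    val trainingData = ssc.textFileStream(args(0)).map(Vectors.parse)
    val testData = ssc.textFileStream(args(1)).map(LabeledPoint.parse)

    val model = new StreamingKMeans()
      .setK(args(3).toInt)
      .setDecayFactor(1.0)
      .setRandomCenters(args(4).toInt, weight = 0.0)

    // trainOn updates cluster centers on each batch; predictOnValues + print()
    // produces the "Time: ... ms" banners seen below (empty, since no files arrive).
    model.trainOn(trainingData)
    model.predictOnValues(testData.map(lp => (lp.label, lp.features))).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that every batch in the log reports `New files at time … ms:` with no file list, so the model is never actually updated; new data must be atomically moved into the monitored directories after the context starts for `textFileStream` to pick it up.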
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/cdh5.7_jars/jars/avro-tools-1.7.6-cdh5.7.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/cdh5.7_jars/jars/pig-0.12.0-cdh5.7.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/cdh5.7_jars/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/cdh5.7_jars/jars/slf4j-simple-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
16/05/27 10:01:01 INFO spark.SparkContext: Running Spark version 1.6.0
16/05/27 10:01:02 INFO spark.SecurityManager: Changing view acls to: Administrator
16/05/27 10:01:02 INFO spark.SecurityManager: Changing modify acls to: Administrator
16/05/27 10:01:02 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Administrator); users with modify permissions: Set(Administrator)
16/05/27 10:01:02 INFO util.Utils: Successfully started service 'sparkDriver' on port 54970.
16/05/27 10:01:02 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/05/27 10:01:02 INFO Remoting: Starting remoting
16/05/27 10:01:03 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.8.191:54983]
16/05/27 10:01:03 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@192.168.8.191:54983]
16/05/27 10:01:03 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 54983.
16/05/27 10:01:03 INFO spark.SparkEnv: Registering MapOutputTracker
16/05/27 10:01:03 INFO spark.SparkEnv: Registering BlockManagerMaster
16/05/27 10:01:03 INFO storage.DiskBlockManager: Created local directory at C:\Users\Administrator.PC-201512221019\AppData\Local\Temp\blockmgr-5f0e75f6-5944-4be7-9265-98f7ea66c882
16/05/27 10:01:03 INFO storage.MemoryStore: MemoryStore started with capacity 958.2 MB
16/05/27 10:01:03 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/05/27 10:01:03 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/27 10:01:03 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040:
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/05/27 10:01:03 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/05/27 10:01:03 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
16/05/27 10:01:03 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/05/27 10:01:03 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
16/05/27 10:01:03 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
16/05/27 10:01:03 INFO ui.SparkUI: Started SparkUI at http://192.168.8.191:4041
16/05/27 10:01:03 INFO executor.Executor: Starting executor ID driver on host localhost
16/05/27 10:01:03 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55002.
16/05/27 10:01:03 INFO netty.NettyBlockTransferService: Server created on 55002
16/05/27 10:01:03 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/05/27 10:01:03 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:55002 with 958.2 MB RAM, BlockManagerId(driver, localhost, 55002)
16/05/27 10:01:03 INFO storage.BlockManagerMaster: Registered BlockManager
16/05/27 10:01:04 INFO dstream.FileInputDStream: Duration for remembering RDDs set to 60000 ms for org.apache.spark.streaming.dstream.FileInputDStream@34d1450f
16/05/27 10:01:04 INFO dstream.FileInputDStream: Duration for remembering RDDs set to 60000 ms for org.apache.spark.streaming.dstream.FileInputDStream@1ae39f21
16/05/27 10:01:04 INFO dstream.ForEachDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.MappedDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.MappedDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.FileInputDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.FileInputDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.FileInputDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.FileInputDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.FileInputDStream: Remember duration = 60000 ms
16/05/27 10:01:04 INFO dstream.FileInputDStream: Initialized and validated org.apache.spark.streaming.dstream.FileInputDStream@34d1450f
16/05/27 10:01:04 INFO dstream.MappedDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.MappedDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.MappedDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@614b722
16/05/27 10:01:04 INFO dstream.MappedDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.MappedDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.MappedDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@3a031c4c
16/05/27 10:01:04 INFO dstream.ForEachDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.ForEachDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.ForEachDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.ForEachDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@2236df55
16/05/27 10:01:04 INFO dstream.ForEachDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.MapValuedDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.MappedDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.MappedDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.MappedDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.FileInputDStream: metadataCleanupDelay = -1
16/05/27 10:01:04 INFO dstream.FileInputDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.FileInputDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.FileInputDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.FileInputDStream: Remember duration = 60000 ms
16/05/27 10:01:04 INFO dstream.FileInputDStream: Initialized and validated org.apache.spark.streaming.dstream.FileInputDStream@1ae39f21
16/05/27 10:01:04 INFO dstream.MappedDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.MappedDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.MappedDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@408056ff
16/05/27 10:01:04 INFO dstream.MappedDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.MappedDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.MappedDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@29cfa06e
16/05/27 10:01:04 INFO dstream.MappedDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.MappedDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.MappedDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@7e2dab33
16/05/27 10:01:04 INFO dstream.MapValuedDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.MapValuedDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.MapValuedDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.MapValuedDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.MapValuedDStream: Initialized and validated org.apache.spark.streaming.dstream.MapValuedDStream@6842e1fc
16/05/27 10:01:04 INFO dstream.ForEachDStream: Slide time = 5000 ms
16/05/27 10:01:04 INFO dstream.ForEachDStream: Storage level = StorageLevel(false, false, false, false, 1)
16/05/27 10:01:04 INFO dstream.ForEachDStream: Checkpoint interval = null
16/05/27 10:01:04 INFO dstream.ForEachDStream: Remember duration = 5000 ms
16/05/27 10:01:04 INFO dstream.ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@7f029cdd
16/05/27 10:01:04 INFO util.RecurringTimer: Started timer for JobGenerator at time 1464314465000
16/05/27 10:01:04 INFO scheduler.JobGenerator: Started JobGenerator at 1464314465000 ms
16/05/27 10:01:04 INFO scheduler.JobScheduler: Started JobScheduler
16/05/27 10:01:04 INFO streaming.StreamingContext: StreamingContext started
16/05/27 10:01:06 WARN : Your hostname, PC-201512221019 resolves to a loopback/non-reachable address: fe80:0:0:0:c8a7:23ab:cac1:61b6%24, but we couldn't find any external IP address!
16/05/27 10:01:08 INFO dstream.FileInputDStream: Finding new files took 3344 ms
16/05/27 10:01:08 INFO dstream.FileInputDStream: New files at time 1464314465000 ms:
16/05/27 10:01:08 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:01:08 INFO dstream.FileInputDStream: New files at time 1464314465000 ms:
16/05/27 10:01:08 INFO scheduler.JobScheduler: Added jobs for time 1464314465000 ms
16/05/27 10:01:08 INFO scheduler.JobScheduler: Starting job streaming job 1464314465000 ms.0 from job set of time 1464314465000 ms
16/05/27 10:01:08 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:08 INFO scheduler.DAGScheduler: Job 0 finished: collect at StreamingKMeans.scala:89, took 0.001748 s
16/05/27 10:01:08 WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
16/05/27 10:01:08 WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
16/05/27 10:01:08 INFO scheduler.JobScheduler: Finished job streaming job 1464314465000 ms.0 from job set of time 1464314465000 ms
16/05/27 10:01:08 INFO scheduler.JobScheduler: Starting job streaming job 1464314465000 ms.1 from job set of time 1464314465000 ms
16/05/27 10:01:08 INFO scheduler.JobScheduler: Finished job streaming job 1464314465000 ms.1 from job set of time 1464314465000 ms
16/05/27 10:01:08 INFO scheduler.JobScheduler: Total delay: 3.536 s for time 1464314465000 ms (execution: 0.127 s)
-------------------------------------------
Time: 1464314465000 ms
-------------------------------------------
16/05/27 10:01:08 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314405000 ms:
16/05/27 10:01:08 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314405000 ms:
16/05/27 10:01:08 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:08 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:10 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:10 INFO dstream.FileInputDStream: New files at time 1464314470000 ms:
16/05/27 10:01:10 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:01:10 INFO dstream.FileInputDStream: New files at time 1464314470000 ms:
16/05/27 10:01:10 INFO scheduler.JobScheduler: Added jobs for time 1464314470000 ms
16/05/27 10:01:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314470000 ms.0 from job set of time 1464314470000 ms
16/05/27 10:01:10 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:10 INFO scheduler.DAGScheduler: Job 1 finished: collect at StreamingKMeans.scala:89, took 0.000030 s
16/05/27 10:01:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314470000 ms.0 from job set of time 1464314470000 ms
-------------------------------------------
Time: 1464314470000 ms
-------------------------------------------
16/05/27 10:01:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314470000 ms.1 from job set of time 1464314470000 ms
16/05/27 10:01:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314470000 ms.1 from job set of time 1464314470000 ms
16/05/27 10:01:10 INFO scheduler.JobScheduler: Total delay: 0.062 s for time 1464314470000 ms (execution: 0.023 s)
16/05/27 10:01:10 INFO rdd.MapPartitionsRDD: Removing RDD 2 from persistence list
16/05/27 10:01:10 INFO rdd.MapPartitionsRDD: Removing RDD 1 from persistence list
16/05/27 10:01:10 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314410000 ms:
16/05/27 10:01:10 INFO rdd.MapPartitionsRDD: Removing RDD 7 from persistence list
16/05/27 10:01:10 INFO rdd.MapPartitionsRDD: Removing RDD 6 from persistence list
16/05/27 10:01:10 INFO rdd.MapPartitionsRDD: Removing RDD 5 from persistence list
16/05/27 10:01:10 INFO rdd.MapPartitionsRDD: Removing RDD 4 from persistence list
16/05/27 10:01:10 INFO storage.BlockManager: Removing RDD 5
16/05/27 10:01:10 INFO storage.BlockManager: Removing RDD 2
16/05/27 10:01:10 INFO storage.BlockManager: Removing RDD 1
16/05/27 10:01:10 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314410000 ms:
16/05/27 10:01:10 INFO storage.BlockManager: Removing RDD 4
16/05/27 10:01:10 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:10 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:10 INFO storage.BlockManager: Removing RDD 6
16/05/27 10:01:10 INFO storage.BlockManager: Removing RDD 7
16/05/27 10:01:15 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:15 INFO dstream.FileInputDStream: New files at time 1464314475000 ms:
16/05/27 10:01:15 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:15 INFO dstream.FileInputDStream: New files at time 1464314475000 ms:
16/05/27 10:01:15 INFO scheduler.JobScheduler: Added jobs for time 1464314475000 ms
16/05/27 10:01:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314475000 ms.0 from job set of time 1464314475000 ms
16/05/27 10:01:15 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:15 INFO scheduler.DAGScheduler: Job 2 finished: collect at StreamingKMeans.scala:89, took 0.000028 s
16/05/27 10:01:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314475000 ms.0 from job set of time 1464314475000 ms
-------------------------------------------
Time: 1464314475000 ms
-------------------------------------------
16/05/27 10:01:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314475000 ms.1 from job set of time 1464314475000 ms
16/05/27 10:01:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314475000 ms.1 from job set of time 1464314475000 ms
16/05/27 10:01:15 INFO rdd.MapPartitionsRDD: Removing RDD 12 from persistence list
16/05/27 10:01:15 INFO scheduler.JobScheduler: Total delay: 0.161 s for time 1464314475000 ms (execution: 0.028 s)
16/05/27 10:01:15 INFO storage.BlockManager: Removing RDD 12
16/05/27 10:01:15 INFO rdd.MapPartitionsRDD: Removing RDD 11 from persistence list
16/05/27 10:01:15 INFO storage.BlockManager: Removing RDD 11
16/05/27 10:01:15 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314415000 ms:
16/05/27 10:01:15 INFO rdd.MapPartitionsRDD: Removing RDD 17 from persistence list
16/05/27 10:01:15 INFO storage.BlockManager: Removing RDD 17
16/05/27 10:01:15 INFO rdd.MapPartitionsRDD: Removing RDD 16 from persistence list
16/05/27 10:01:15 INFO storage.BlockManager: Removing RDD 16
16/05/27 10:01:15 INFO rdd.MapPartitionsRDD: Removing RDD 15 from persistence list
16/05/27 10:01:15 INFO storage.BlockManager: Removing RDD 15
16/05/27 10:01:15 INFO rdd.MapPartitionsRDD: Removing RDD 14 from persistence list
16/05/27 10:01:15 INFO storage.BlockManager: Removing RDD 14
16/05/27 10:01:15 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314415000 ms:
16/05/27 10:01:15 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:15 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:20 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:20 INFO dstream.FileInputDStream: New files at time 1464314480000 ms:
16/05/27 10:01:20 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:20 INFO dstream.FileInputDStream: New files at time 1464314480000 ms:
16/05/27 10:01:20 INFO scheduler.JobScheduler: Added jobs for time 1464314480000 ms
16/05/27 10:01:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314480000 ms.0 from job set of time 1464314480000 ms
-------------------------------------------
Time: 1464314480000 ms
-------------------------------------------
16/05/27 10:01:20 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:20 INFO scheduler.DAGScheduler: Job 3 finished: collect at StreamingKMeans.scala:89, took 0.000019 s
16/05/27 10:01:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314480000 ms.0 from job set of time 1464314480000 ms
16/05/27 10:01:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314480000 ms.1 from job set of time 1464314480000 ms
16/05/27 10:01:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314480000 ms.1 from job set of time 1464314480000 ms
16/05/27 10:01:20 INFO scheduler.JobScheduler: Total delay: 0.070 s for time 1464314480000 ms (execution: 0.024 s)
16/05/27 10:01:20 INFO rdd.MapPartitionsRDD: Removing RDD 22 from persistence list
16/05/27 10:01:20 INFO storage.BlockManager: Removing RDD 22
16/05/27 10:01:20 INFO rdd.MapPartitionsRDD: Removing RDD 21 from persistence list
16/05/27 10:01:20 INFO storage.BlockManager: Removing RDD 21
16/05/27 10:01:20 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314420000 ms:
16/05/27 10:01:20 INFO rdd.MapPartitionsRDD: Removing RDD 27 from persistence list
16/05/27 10:01:20 INFO storage.BlockManager: Removing RDD 27
16/05/27 10:01:20 INFO rdd.MapPartitionsRDD: Removing RDD 26 from persistence list
16/05/27 10:01:20 INFO storage.BlockManager: Removing RDD 26
16/05/27 10:01:20 INFO rdd.MapPartitionsRDD: Removing RDD 25 from persistence list
16/05/27 10:01:20 INFO storage.BlockManager: Removing RDD 25
16/05/27 10:01:20 INFO rdd.MapPartitionsRDD: Removing RDD 24 from persistence list
16/05/27 10:01:20 INFO storage.BlockManager: Removing RDD 24
16/05/27 10:01:20 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314420000 ms:
16/05/27 10:01:20 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:20 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:25 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:25 INFO dstream.FileInputDStream: New files at time 1464314485000 ms:
16/05/27 10:01:25 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:01:25 INFO dstream.FileInputDStream: New files at time 1464314485000 ms:
16/05/27 10:01:25 INFO scheduler.JobScheduler: Added jobs for time 1464314485000 ms
16/05/27 10:01:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314485000 ms.0 from job set of time 1464314485000 ms
16/05/27 10:01:25 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:25 INFO scheduler.DAGScheduler: Job 4 finished: collect at StreamingKMeans.scala:89, took 0.000018 s
16/05/27 10:01:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314485000 ms.0 from job set of time 1464314485000 ms
16/05/27 10:01:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314485000 ms.1 from job set of time 1464314485000 ms
16/05/27 10:01:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314485000 ms.1 from job set of time 1464314485000 ms
16/05/27 10:01:25 INFO rdd.MapPartitionsRDD: Removing RDD 32 from persistence list
16/05/27 10:01:25 INFO scheduler.JobScheduler: Total delay: 0.048 s for time 1464314485000 ms (execution: 0.017 s)
16/05/27 10:01:25 INFO storage.BlockManager: Removing RDD 32
16/05/27 10:01:25 INFO rdd.MapPartitionsRDD: Removing RDD 31 from persistence list
-------------------------------------------
Time: 1464314485000 ms
-------------------------------------------
16/05/27 10:01:25 INFO storage.BlockManager: Removing RDD 31
16/05/27 10:01:25 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314425000 ms:
16/05/27 10:01:25 INFO rdd.MapPartitionsRDD: Removing RDD 37 from persistence list
16/05/27 10:01:25 INFO storage.BlockManager: Removing RDD 37
16/05/27 10:01:25 INFO rdd.MapPartitionsRDD: Removing RDD 36 from persistence list
16/05/27 10:01:25 INFO storage.BlockManager: Removing RDD 36
16/05/27 10:01:25 INFO rdd.MapPartitionsRDD: Removing RDD 35 from persistence list
16/05/27 10:01:25 INFO storage.BlockManager: Removing RDD 35
16/05/27 10:01:25 INFO rdd.MapPartitionsRDD: Removing RDD 34 from persistence list
16/05/27 10:01:25 INFO storage.BlockManager: Removing RDD 34
16/05/27 10:01:25 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314425000 ms:
16/05/27 10:01:25 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:25 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:30 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:01:30 INFO dstream.FileInputDStream: New files at time 1464314490000 ms:
16/05/27 10:01:30 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:30 INFO dstream.FileInputDStream: New files at time 1464314490000 ms:
16/05/27 10:01:30 INFO scheduler.JobScheduler: Added jobs for time 1464314490000 ms
16/05/27 10:01:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314490000 ms.0 from job set of time 1464314490000 ms
16/05/27 10:01:30 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:30 INFO scheduler.DAGScheduler: Job 5 finished: collect at StreamingKMeans.scala:89, took 0.000028 s
16/05/27 10:01:30 INFO scheduler.JobScheduler: Finished job streaming job 1464314490000 ms.0 from job set of time 1464314490000 ms
16/05/27 10:01:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314490000 ms.1 from job set of time 1464314490000 ms
16/05/27 10:01:30 INFO scheduler.JobScheduler: Finished job streaming job 1464314490000 ms.1 from job set of time 1464314490000 ms
16/05/27 10:01:30 INFO rdd.MapPartitionsRDD: Removing RDD 42 from persistence list
16/05/27 10:01:30 INFO scheduler.JobScheduler: Total delay: 0.051 s for time 1464314490000 ms (execution: 0.021 s)
16/05/27 10:01:30 INFO rdd.MapPartitionsRDD: Removing RDD 41 from persistence list
16/05/27 10:01:30 INFO storage.BlockManager: Removing RDD 42
16/05/27 10:01:30 INFO storage.BlockManager: Removing RDD 41
16/05/27 10:01:30 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314430000 ms:
16/05/27 10:01:30 INFO rdd.MapPartitionsRDD: Removing RDD 47 from persistence list
16/05/27 10:01:30 INFO storage.BlockManager: Removing RDD 47
16/05/27 10:01:30 INFO rdd.MapPartitionsRDD: Removing RDD 46 from persistence list
16/05/27 10:01:30 INFO storage.BlockManager: Removing RDD 46
16/05/27 10:01:30 INFO rdd.MapPartitionsRDD: Removing RDD 45 from persistence list
16/05/27 10:01:30 INFO storage.BlockManager: Removing RDD 45
-------------------------------------------
Time: 1464314490000 ms
-------------------------------------------
16/05/27 10:01:30 INFO rdd.MapPartitionsRDD: Removing RDD 44 from persistence list
16/05/27 10:01:30 INFO storage.BlockManager: Removing RDD 44
16/05/27 10:01:30 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314430000 ms:
16/05/27 10:01:30 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:30 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:35 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:35 INFO dstream.FileInputDStream: New files at time 1464314495000 ms:
16/05/27 10:01:35 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:35 INFO dstream.FileInputDStream: New files at time 1464314495000 ms:
16/05/27 10:01:35 INFO scheduler.JobScheduler: Added jobs for time 1464314495000 ms
16/05/27 10:01:35 INFO scheduler.JobScheduler: Starting job streaming job 1464314495000 ms.0 from job set of time 1464314495000 ms
-------------------------------------------
Time: 1464314495000 ms
-------------------------------------------
16/05/27 10:01:35 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:35 INFO scheduler.DAGScheduler: Job 6 finished: collect at StreamingKMeans.scala:89, took 0.000028 s
16/05/27 10:01:35 INFO scheduler.JobScheduler: Finished job streaming job 1464314495000 ms.0 from job set of time 1464314495000 ms
16/05/27 10:01:35 INFO scheduler.JobScheduler: Starting job streaming job 1464314495000 ms.1 from job set of time 1464314495000 ms
16/05/27 10:01:35 INFO scheduler.JobScheduler: Finished job streaming job 1464314495000 ms.1 from job set of time 1464314495000 ms
16/05/27 10:01:35 INFO rdd.MapPartitionsRDD: Removing RDD 52 from persistence list
16/05/27 10:01:35 INFO scheduler.JobScheduler: Total delay: 0.062 s for time 1464314495000 ms (execution: 0.022 s)
16/05/27 10:01:35 INFO storage.BlockManager: Removing RDD 52
16/05/27 10:01:35 INFO rdd.MapPartitionsRDD: Removing RDD 51 from persistence list
16/05/27 10:01:35 INFO storage.BlockManager: Removing RDD 51
16/05/27 10:01:35 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314435000 ms:
16/05/27 10:01:35 INFO rdd.MapPartitionsRDD: Removing RDD 57 from persistence list
16/05/27 10:01:35 INFO storage.BlockManager: Removing RDD 57
16/05/27 10:01:35 INFO rdd.MapPartitionsRDD: Removing RDD 56 from persistence list
16/05/27 10:01:35 INFO storage.BlockManager: Removing RDD 56
16/05/27 10:01:35 INFO rdd.MapPartitionsRDD: Removing RDD 55 from persistence list
16/05/27 10:01:35 INFO storage.BlockManager: Removing RDD 55
16/05/27 10:01:35 INFO rdd.MapPartitionsRDD: Removing RDD 54 from persistence list
16/05/27 10:01:35 INFO storage.BlockManager: Removing RDD 54
16/05/27 10:01:35 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314435000 ms:
16/05/27 10:01:35 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:35 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:40 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:01:40 INFO dstream.FileInputDStream: New files at time 1464314500000 ms:
16/05/27 10:01:40 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:40 INFO dstream.FileInputDStream: New files at time 1464314500000 ms:
16/05/27 10:01:40 INFO scheduler.JobScheduler: Added jobs for time 1464314500000 ms
16/05/27 10:01:40 INFO scheduler.JobScheduler: Starting job streaming job 1464314500000 ms.0 from job set of time 1464314500000 ms
-------------------------------------------
Time: 1464314500000 ms
-------------------------------------------
16/05/27 10:01:40 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:40 INFO scheduler.DAGScheduler: Job 7 finished: collect at StreamingKMeans.scala:89, took 0.000027 s
16/05/27 10:01:40 INFO scheduler.JobScheduler: Finished job streaming job 1464314500000 ms.0 from job set of time 1464314500000 ms
16/05/27 10:01:40 INFO scheduler.JobScheduler: Starting job streaming job 1464314500000 ms.1 from job set of time 1464314500000 ms
16/05/27 10:01:40 INFO scheduler.JobScheduler: Finished job streaming job 1464314500000 ms.1 from job set of time 1464314500000 ms
16/05/27 10:01:40 INFO rdd.MapPartitionsRDD: Removing RDD 62 from persistence list
16/05/27 10:01:40 INFO scheduler.JobScheduler: Total delay: 0.054 s for time 1464314500000 ms (execution: 0.019 s)
16/05/27 10:01:40 INFO storage.BlockManager: Removing RDD 62
16/05/27 10:01:40 INFO rdd.MapPartitionsRDD: Removing RDD 61 from persistence list
16/05/27 10:01:40 INFO storage.BlockManager: Removing RDD 61
16/05/27 10:01:40 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314440000 ms:
16/05/27 10:01:40 INFO rdd.MapPartitionsRDD: Removing RDD 67 from persistence list
16/05/27 10:01:40 INFO storage.BlockManager: Removing RDD 67
16/05/27 10:01:40 INFO rdd.MapPartitionsRDD: Removing RDD 66 from persistence list
16/05/27 10:01:40 INFO storage.BlockManager: Removing RDD 66
16/05/27 10:01:40 INFO rdd.MapPartitionsRDD: Removing RDD 65 from persistence list
16/05/27 10:01:40 INFO storage.BlockManager: Removing RDD 65
16/05/27 10:01:40 INFO rdd.MapPartitionsRDD: Removing RDD 64 from persistence list
16/05/27 10:01:40 INFO storage.BlockManager: Removing RDD 64
16/05/27 10:01:40 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314440000 ms:
16/05/27 10:01:40 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:40 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:45 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:45 INFO dstream.FileInputDStream: New files at time 1464314505000 ms:
16/05/27 10:01:45 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:01:45 INFO dstream.FileInputDStream: New files at time 1464314505000 ms:
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 7
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 6
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 5
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 4
16/05/27 10:01:45 INFO scheduler.JobScheduler: Added jobs for time 1464314505000 ms
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 3
16/05/27 10:01:45 INFO scheduler.JobScheduler: Starting job streaming job 1464314505000 ms.0 from job set of time 1464314505000 ms
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 2
16/05/27 10:01:45 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:45 INFO scheduler.DAGScheduler: Job 8 finished: collect at StreamingKMeans.scala:89, took 0.000028 s
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 1
16/05/27 10:01:45 INFO scheduler.JobScheduler: Finished job streaming job 1464314505000 ms.0 from job set of time 1464314505000 ms
16/05/27 10:01:45 INFO scheduler.JobScheduler: Starting job streaming job 1464314505000 ms.1 from job set of time 1464314505000 ms
16/05/27 10:01:45 INFO scheduler.JobScheduler: Finished job streaming job 1464314505000 ms.1 from job set of time 1464314505000 ms
16/05/27 10:01:45 INFO scheduler.JobScheduler: Total delay: 0.562 s for time 1464314505000 ms (execution: 0.045 s)
-------------------------------------------
Time: 1464314505000 ms
-------------------------------------------
16/05/27 10:01:45 INFO spark.ContextCleaner: Cleaned shuffle 0
16/05/27 10:01:45 INFO rdd.MapPartitionsRDD: Removing RDD 72 from persistence list
16/05/27 10:01:45 INFO rdd.MapPartitionsRDD: Removing RDD 71 from persistence list
16/05/27 10:01:45 INFO storage.BlockManager: Removing RDD 72
16/05/27 10:01:45 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314445000 ms:
16/05/27 10:01:45 INFO rdd.MapPartitionsRDD: Removing RDD 77 from persistence list
16/05/27 10:01:45 INFO storage.BlockManager: Removing RDD 77
16/05/27 10:01:45 INFO rdd.MapPartitionsRDD: Removing RDD 76 from persistence list
16/05/27 10:01:45 INFO storage.BlockManager: Removing RDD 71
16/05/27 10:01:45 INFO storage.BlockManager: Removing RDD 76
16/05/27 10:01:45 INFO rdd.MapPartitionsRDD: Removing RDD 75 from persistence list
16/05/27 10:01:45 INFO storage.BlockManager: Removing RDD 75
16/05/27 10:01:45 INFO rdd.MapPartitionsRDD: Removing RDD 74 from persistence list
16/05/27 10:01:45 INFO storage.BlockManager: Removing RDD 74
16/05/27 10:01:45 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314445000 ms:
16/05/27 10:01:45 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:45 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:50 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:50 INFO dstream.FileInputDStream: New files at time 1464314510000 ms:
16/05/27 10:01:50 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:01:50 INFO dstream.FileInputDStream: New files at time 1464314510000 ms:
16/05/27 10:01:50 INFO scheduler.JobScheduler: Added jobs for time 1464314510000 ms
16/05/27 10:01:50 INFO scheduler.JobScheduler: Starting job streaming job 1464314510000 ms.0 from job set of time 1464314510000 ms
16/05/27 10:01:50 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:50 INFO scheduler.DAGScheduler: Job 9 finished: collect at StreamingKMeans.scala:89, took 0.000020 s
16/05/27 10:01:50 INFO scheduler.JobScheduler: Finished job streaming job 1464314510000 ms.0 from job set of time 1464314510000 ms
16/05/27 10:01:50 INFO scheduler.JobScheduler: Starting job streaming job 1464314510000 ms.1 from job set of time 1464314510000 ms
16/05/27 10:01:50 INFO scheduler.JobScheduler: Finished job streaming job 1464314510000 ms.1 from job set of time 1464314510000 ms
16/05/27 10:01:50 INFO rdd.MapPartitionsRDD: Removing RDD 82 from persistence list
16/05/27 10:01:50 INFO scheduler.JobScheduler: Total delay: 0.059 s for time 1464314510000 ms (execution: 0.020 s)
16/05/27 10:01:50 INFO storage.BlockManager: Removing RDD 82
16/05/27 10:01:50 INFO rdd.MapPartitionsRDD: Removing RDD 81 from persistence list
16/05/27 10:01:50 INFO storage.BlockManager: Removing RDD 81
16/05/27 10:01:50 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314450000 ms:
16/05/27 10:01:50 INFO rdd.MapPartitionsRDD: Removing RDD 87 from persistence list
16/05/27 10:01:50 INFO storage.BlockManager: Removing RDD 87
-------------------------------------------
Time: 1464314510000 ms
-------------------------------------------
16/05/27 10:01:50 INFO rdd.MapPartitionsRDD: Removing RDD 86 from persistence list
16/05/27 10:01:50 INFO storage.BlockManager: Removing RDD 86
16/05/27 10:01:50 INFO rdd.MapPartitionsRDD: Removing RDD 85 from persistence list
16/05/27 10:01:50 INFO storage.BlockManager: Removing RDD 85
16/05/27 10:01:50 INFO rdd.MapPartitionsRDD: Removing RDD 84 from persistence list
16/05/27 10:01:50 INFO storage.BlockManager: Removing RDD 84
16/05/27 10:01:50 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314450000 ms:
16/05/27 10:01:50 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:50 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:01:55 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:01:55 INFO dstream.FileInputDStream: New files at time 1464314515000 ms:
16/05/27 10:01:55 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:01:55 INFO dstream.FileInputDStream: New files at time 1464314515000 ms:
16/05/27 10:01:55 INFO scheduler.JobScheduler: Starting job streaming job 1464314515000 ms.0 from job set of time 1464314515000 ms
16/05/27 10:01:55 INFO scheduler.JobScheduler: Added jobs for time 1464314515000 ms
16/05/27 10:01:55 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:01:55 INFO scheduler.DAGScheduler: Job 10 finished: collect at StreamingKMeans.scala:89, took 0.000019 s
16/05/27 10:01:55 INFO scheduler.JobScheduler: Finished job streaming job 1464314515000 ms.0 from job set of time 1464314515000 ms
16/05/27 10:01:55 INFO scheduler.JobScheduler: Starting job streaming job 1464314515000 ms.1 from job set of time 1464314515000 ms
16/05/27 10:01:55 INFO scheduler.JobScheduler: Finished job streaming job 1464314515000 ms.1 from job set of time 1464314515000 ms
16/05/27 10:01:55 INFO scheduler.JobScheduler: Total delay: 0.069 s for time 1464314515000 ms (execution: 0.022 s)
16/05/27 10:01:55 INFO rdd.MapPartitionsRDD: Removing RDD 92 from persistence list
16/05/27 10:01:55 INFO storage.BlockManager: Removing RDD 92
16/05/27 10:01:55 INFO rdd.MapPartitionsRDD: Removing RDD 91 from persistence list
16/05/27 10:01:55 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314455000 ms:
16/05/27 10:01:55 INFO rdd.MapPartitionsRDD: Removing RDD 97 from persistence list
16/05/27 10:01:55 INFO storage.BlockManager: Removing RDD 91
16/05/27 10:01:55 INFO rdd.MapPartitionsRDD: Removing RDD 96 from persistence list
16/05/27 10:01:55 INFO storage.BlockManager: Removing RDD 97
16/05/27 10:01:55 INFO rdd.MapPartitionsRDD: Removing RDD 95 from persistence list
16/05/27 10:01:55 INFO storage.BlockManager: Removing RDD 96
16/05/27 10:01:55 INFO rdd.MapPartitionsRDD: Removing RDD 94 from persistence list
16/05/27 10:01:55 INFO storage.BlockManager: Removing RDD 95
-------------------------------------------
Time: 1464314515000 ms
-------------------------------------------
16/05/27 10:01:55 INFO storage.BlockManager: Removing RDD 94
16/05/27 10:01:55 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314455000 ms:
16/05/27 10:01:55 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:01:55 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:02:00 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:00 INFO dstream.FileInputDStream: New files at time 1464314520000 ms:
16/05/27 10:02:00 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:00 INFO dstream.FileInputDStream: New files at time 1464314520000 ms:
16/05/27 10:02:00 INFO scheduler.JobScheduler: Starting job streaming job 1464314520000 ms.0 from job set of time 1464314520000 ms
16/05/27 10:02:00 INFO scheduler.JobScheduler: Added jobs for time 1464314520000 ms
-------------------------------------------
Time: 1464314520000 ms
-------------------------------------------
16/05/27 10:02:00 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:02:00 INFO scheduler.DAGScheduler: Job 11 finished: collect at StreamingKMeans.scala:89, took 0.000021 s
16/05/27 10:02:00 INFO scheduler.JobScheduler: Finished job streaming job 1464314520000 ms.0 from job set of time 1464314520000 ms
16/05/27 10:02:00 INFO scheduler.JobScheduler: Starting job streaming job 1464314520000 ms.1 from job set of time 1464314520000 ms
16/05/27 10:02:00 INFO scheduler.JobScheduler: Finished job streaming job 1464314520000 ms.1 from job set of time 1464314520000 ms
16/05/27 10:02:00 INFO rdd.MapPartitionsRDD: Removing RDD 102 from persistence list
16/05/27 10:02:00 INFO scheduler.JobScheduler: Total delay: 0.058 s for time 1464314520000 ms (execution: 0.011 s)
16/05/27 10:02:00 INFO storage.BlockManager: Removing RDD 102
16/05/27 10:02:00 INFO rdd.MapPartitionsRDD: Removing RDD 101 from persistence list
16/05/27 10:02:00 INFO storage.BlockManager: Removing RDD 101
16/05/27 10:02:00 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314460000 ms:
16/05/27 10:02:00 INFO rdd.MapPartitionsRDD: Removing RDD 107 from persistence list
16/05/27 10:02:00 INFO storage.BlockManager: Removing RDD 107
16/05/27 10:02:00 INFO rdd.MapPartitionsRDD: Removing RDD 106 from persistence list
16/05/27 10:02:00 INFO storage.BlockManager: Removing RDD 106
16/05/27 10:02:00 INFO rdd.MapPartitionsRDD: Removing RDD 105 from persistence list
16/05/27 10:02:00 INFO storage.BlockManager: Removing RDD 105
16/05/27 10:02:00 INFO rdd.MapPartitionsRDD: Removing RDD 104 from persistence list
16/05/27 10:02:00 INFO storage.BlockManager: Removing RDD 104
16/05/27 10:02:00 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314460000 ms:
16/05/27 10:02:00 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:02:00 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:02:05 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:02:05 INFO dstream.FileInputDStream: New files at time 1464314525000 ms:
16/05/27 10:02:05 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:02:05 INFO dstream.FileInputDStream: New files at time 1464314525000 ms:
16/05/27 10:02:05 INFO scheduler.JobScheduler: Added jobs for time 1464314525000 ms
16/05/27 10:02:05 INFO scheduler.JobScheduler: Starting job streaming job 1464314525000 ms.0 from job set of time 1464314525000 ms
16/05/27 10:02:05 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:02:05 INFO scheduler.DAGScheduler: Job 12 finished: collect at StreamingKMeans.scala:89, took 0.000022 s
16/05/27 10:02:05 INFO scheduler.JobScheduler: Finished job streaming job 1464314525000 ms.0 from job set of time 1464314525000 ms
16/05/27 10:02:05 INFO scheduler.JobScheduler: Starting job streaming job 1464314525000 ms.1 from job set of time 1464314525000 ms
16/05/27 10:02:05 INFO scheduler.JobScheduler: Finished job streaming job 1464314525000 ms.1 from job set of time 1464314525000 ms
16/05/27 10:02:05 INFO rdd.MapPartitionsRDD: Removing RDD 112 from persistence list
16/05/27 10:02:05 INFO scheduler.JobScheduler: Total delay: 0.064 s for time 1464314525000 ms (execution: 0.022 s)
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 112
16/05/27 10:02:05 INFO rdd.MapPartitionsRDD: Removing RDD 111 from persistence list
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 111
16/05/27 10:02:05 INFO rdd.UnionRDD: Removing RDD 0 from persistence list
16/05/27 10:02:05 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314465000 ms:
-------------------------------------------
Time: 1464314525000 ms
-------------------------------------------
16/05/27 10:02:05 INFO rdd.MapPartitionsRDD: Removing RDD 117 from persistence list
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 0
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 117
16/05/27 10:02:05 INFO rdd.MapPartitionsRDD: Removing RDD 116 from persistence list
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 116
16/05/27 10:02:05 INFO rdd.MapPartitionsRDD: Removing RDD 115 from persistence list
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 115
16/05/27 10:02:05 INFO rdd.MapPartitionsRDD: Removing RDD 114 from persistence list
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 114
16/05/27 10:02:05 INFO rdd.UnionRDD: Removing RDD 3 from persistence list
16/05/27 10:02:05 INFO storage.BlockManager: Removing RDD 3
16/05/27 10:02:05 INFO dstream.FileInputDStream: Cleared 0 old files that were older than 1464314465000 ms:
16/05/27 10:02:05 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:02:05 INFO scheduler.InputInfoTracker: remove old batch metadata:
16/05/27 10:02:10 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:02:10 INFO dstream.FileInputDStream: New files at time 1464314530000 ms:
16/05/27 10:02:10 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:10 INFO dstream.FileInputDStream: New files at time 1464314530000 ms:
16/05/27 10:02:10 INFO scheduler.JobScheduler: Added jobs for time 1464314530000 ms
16/05/27 10:02:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314530000 ms.0 from job set of time 1464314530000 ms
16/05/27 10:02:10 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:02:10 INFO scheduler.DAGScheduler: Job 13 finished: collect at StreamingKMeans.scala:89, took 0.000021 s
16/05/27 10:02:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314530000 ms.0 from job set of time 1464314530000 ms
16/05/27 10:02:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314530000 ms.1 from job set of time 1464314530000 ms
16/05/27 10:02:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314530000 ms.1 from job set of time 1464314530000 ms
16/05/27 10:02:10 INFO rdd.MapPartitionsRDD: Removing RDD 122 from persistence list
16/05/27 10:02:10 INFO scheduler.JobScheduler: Total delay: 0.044 s for time 1464314530000 ms (execution: 0.020 s)
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 122
16/05/27 10:02:10 INFO rdd.MapPartitionsRDD: Removing RDD 121 from persistence list
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 121
16/05/27 10:02:10 INFO rdd.UnionRDD: Removing RDD 10 from persistence list
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 10
16/05/27 10:02:10 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314470000 ms: 1464314465000 ms
16/05/27 10:02:10 INFO rdd.MapPartitionsRDD: Removing RDD 127 from persistence list
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 127
16/05/27 10:02:10 INFO rdd.MapPartitionsRDD: Removing RDD 126 from persistence list
-------------------------------------------
Time: 1464314530000 ms
-------------------------------------------
16/05/27 10:02:10 INFO rdd.MapPartitionsRDD: Removing RDD 125 from persistence list
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 126
16/05/27 10:02:10 INFO rdd.MapPartitionsRDD: Removing RDD 124 from persistence list
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 125
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 124
16/05/27 10:02:10 INFO rdd.UnionRDD: Removing RDD 13 from persistence list
16/05/27 10:02:10 INFO storage.BlockManager: Removing RDD 13
16/05/27 10:02:10 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314470000 ms: 1464314465000 ms
16/05/27 10:02:10 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:02:10 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314465000 ms
16/05/27 10:02:15 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:15 INFO dstream.FileInputDStream: New files at time 1464314535000 ms:
16/05/27 10:02:15 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:02:15 INFO dstream.FileInputDStream: New files at time 1464314535000 ms:
16/05/27 10:02:15 INFO scheduler.JobScheduler: Added jobs for time 1464314535000 ms
16/05/27 10:02:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314535000 ms.0 from job set of time 1464314535000 ms
16/05/27 10:02:15 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
-------------------------------------------
Time: 1464314535000 ms
-------------------------------------------
16/05/27 10:02:15 INFO scheduler.DAGScheduler: Job 14 finished: collect at StreamingKMeans.scala:89, took 0.000030 s
16/05/27 10:02:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314535000 ms.0 from job set of time 1464314535000 ms
16/05/27 10:02:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314535000 ms.1 from job set of time 1464314535000 ms
16/05/27 10:02:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314535000 ms.1 from job set of time 1464314535000 ms
16/05/27 10:02:15 INFO scheduler.JobScheduler: Total delay: 0.065 s for time 1464314535000 ms (execution: 0.028 s)
16/05/27 10:02:15 INFO rdd.MapPartitionsRDD: Removing RDD 132 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 132
16/05/27 10:02:15 INFO rdd.MapPartitionsRDD: Removing RDD 131 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 131
16/05/27 10:02:15 INFO rdd.UnionRDD: Removing RDD 20 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 20
16/05/27 10:02:15 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314475000 ms: 1464314470000 ms
16/05/27 10:02:15 INFO rdd.MapPartitionsRDD: Removing RDD 137 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 137
16/05/27 10:02:15 INFO rdd.MapPartitionsRDD: Removing RDD 136 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 136
16/05/27 10:02:15 INFO rdd.MapPartitionsRDD: Removing RDD 135 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 135
16/05/27 10:02:15 INFO rdd.MapPartitionsRDD: Removing RDD 134 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 134
16/05/27 10:02:15 INFO rdd.UnionRDD: Removing RDD 23 from persistence list
16/05/27 10:02:15 INFO storage.BlockManager: Removing RDD 23
16/05/27 10:02:15 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314475000 ms: 1464314470000 ms
16/05/27 10:02:15 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:02:15 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314470000 ms
16/05/27 10:02:20 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:02:20 INFO dstream.FileInputDStream: New files at time 1464314540000 ms:
16/05/27 10:02:20 INFO dstream.FileInputDStream: Finding new files took 3 ms
16/05/27 10:02:20 INFO dstream.FileInputDStream: New files at time 1464314540000 ms:
16/05/27 10:02:20 INFO scheduler.JobScheduler: Added jobs for time 1464314540000 ms
16/05/27 10:02:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314540000 ms.0 from job set of time 1464314540000 ms
16/05/27 10:02:20 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:02:20 INFO scheduler.DAGScheduler: Job 15 finished: collect at StreamingKMeans.scala:89, took 0.000021 s
16/05/27 10:02:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314540000 ms.0 from job set of time 1464314540000 ms
16/05/27 10:02:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314540000 ms.1 from job set of time 1464314540000 ms
16/05/27 10:02:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314540000 ms.1 from job set of time 1464314540000 ms
16/05/27 10:02:20 INFO scheduler.JobScheduler: Total delay: 0.060 s for time 1464314540000 ms (execution: 0.022 s)
16/05/27 10:02:20 INFO rdd.MapPartitionsRDD: Removing RDD 142 from persistence list
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 142
16/05/27 10:02:20 INFO rdd.MapPartitionsRDD: Removing RDD 141 from persistence list
-------------------------------------------
Time: 1464314540000 ms
-------------------------------------------
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 141
16/05/27 10:02:20 INFO rdd.UnionRDD: Removing RDD 30 from persistence list
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 30
16/05/27 10:02:20 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314480000 ms: 1464314475000 ms
16/05/27 10:02:20 INFO rdd.MapPartitionsRDD: Removing RDD 147 from persistence list
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 147
16/05/27 10:02:20 INFO rdd.MapPartitionsRDD: Removing RDD 146 from persistence list
16/05/27 10:02:20 INFO rdd.MapPartitionsRDD: Removing RDD 145 from persistence list
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 146
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 145
16/05/27 10:02:20 INFO rdd.MapPartitionsRDD: Removing RDD 144 from persistence list
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 144
16/05/27 10:02:20 INFO rdd.UnionRDD: Removing RDD 33 from persistence list
16/05/27 10:02:20 INFO storage.BlockManager: Removing RDD 33
16/05/27 10:02:20 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314480000 ms: 1464314475000 ms
16/05/27 10:02:20 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:02:20 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314475000 ms
16/05/27 10:02:25 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:25 INFO dstream.FileInputDStream: New files at time 1464314545000 ms:
16/05/27 10:02:25 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:02:25 INFO dstream.FileInputDStream: New files at time 1464314545000 ms:
16/05/27 10:02:25 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:02:25 INFO scheduler.DAGScheduler: Job 16 finished: collect at StreamingKMeans.scala:89, took 0.000025 s
16/05/27 10:02:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314545000 ms.0 from job set of time 1464314545000 ms
16/05/27 10:02:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314545000 ms.0 from job set of time 1464314545000 ms
16/05/27 10:02:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314545000 ms.1 from job set of time 1464314545000 ms
16/05/27 10:02:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314545000 ms.1 from job set of time 1464314545000 ms
16/05/27 10:02:25 INFO scheduler.JobScheduler: Total delay: 0.065 s for time 1464314545000 ms (execution: 0.001 s)
16/05/27 10:02:25 INFO scheduler.JobScheduler: Added jobs for time 1464314545000 ms
16/05/27 10:02:25 INFO rdd.MapPartitionsRDD: Removing RDD 152 from persistence list
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 152
16/05/27 10:02:25 INFO rdd.MapPartitionsRDD: Removing RDD 151 from persistence list
-------------------------------------------
Time: 1464314545000 ms
-------------------------------------------
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 151
16/05/27 10:02:25 INFO rdd.UnionRDD: Removing RDD 40 from persistence list
16/05/27 10:02:25 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314485000 ms: 1464314480000 ms
16/05/27 10:02:25 INFO rdd.MapPartitionsRDD: Removing RDD 157 from persistence list
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 40
16/05/27 10:02:25 INFO rdd.MapPartitionsRDD: Removing RDD 156 from persistence list
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 157
16/05/27 10:02:25 INFO rdd.MapPartitionsRDD: Removing RDD 155 from persistence list
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 156
16/05/27 10:02:25 INFO rdd.MapPartitionsRDD: Removing RDD 154 from persistence list
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 155
16/05/27 10:02:25 INFO rdd.UnionRDD: Removing RDD 43 from persistence list
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 154
16/05/27 10:02:25 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314485000 ms: 1464314480000 ms
16/05/27 10:02:25 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:02:25 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314480000 ms
16/05/27 10:02:25 INFO storage.BlockManager: Removing RDD 43
16/05/27 10:02:30 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:30 INFO dstream.FileInputDStream: New files at time 1464314550000 ms:
16/05/27 10:02:30 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:02:30 INFO dstream.FileInputDStream: New files at time 1464314550000 ms:
16/05/27 10:02:30 INFO scheduler.JobScheduler: Added jobs for time 1464314550000 ms
16/05/27 10:02:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314550000 ms.0 from job set of time 1464314550000 ms
-------------------------------------------
Time: 1464314550000 ms
-------------------------------------------
16/05/27 10:03:10 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:10 INFO scheduler.DAGScheduler: Job 25 finished: collect at StreamingKMeans.scala:89, took 0.000019 s
16/05/27 10:03:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314590000 ms.0 from job set of time 1464314590000 ms
16/05/27 10:03:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314590000 ms.1 from job set of time 1464314590000 ms
16/05/27 10:03:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314590000 ms.1 from job set of time 1464314590000 ms
16/05/27 10:03:10 INFO scheduler.JobScheduler: Total delay: 0.046 s for time 1464314590000 ms (execution: 0.018 s)
16/05/27 10:03:10 INFO rdd.MapPartitionsRDD: Removing RDD 242 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 242
16/05/27 10:03:10 INFO rdd.MapPartitionsRDD: Removing RDD 241 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 241
16/05/27 10:03:10 INFO rdd.UnionRDD: Removing RDD 130 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 130
16/05/27 10:03:10 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314530000 ms: 1464314525000 ms
16/05/27 10:03:10 INFO rdd.MapPartitionsRDD: Removing RDD 247 from persistence list
16/05/27 10:03:10 INFO rdd.MapPartitionsRDD: Removing RDD 246 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 247
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 246
16/05/27 10:03:10 INFO rdd.MapPartitionsRDD: Removing RDD 245 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 245
16/05/27 10:03:10 INFO rdd.MapPartitionsRDD: Removing RDD 244 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 244
16/05/27 10:03:10 INFO rdd.UnionRDD: Removing RDD 133 from persistence list
16/05/27 10:03:10 INFO storage.BlockManager: Removing RDD 133
16/05/27 10:03:10 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314530000 ms: 1464314525000 ms
16/05/27 10:03:10 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:10 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314525000 ms
16/05/27 10:03:15 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:15 INFO dstream.FileInputDStream: New files at time 1464314595000 ms:
16/05/27 10:03:15 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:15 INFO dstream.FileInputDStream: New files at time 1464314595000 ms:
16/05/27 10:03:15 INFO scheduler.JobScheduler: Added jobs for time 1464314595000 ms
16/05/27 10:03:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314595000 ms.0 from job set of time 1464314595000 ms
-------------------------------------------
Time: 1464314595000 ms
-------------------------------------------
16/05/27 10:03:15 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:15 INFO scheduler.DAGScheduler: Job 26 finished: collect at StreamingKMeans.scala:89, took 0.000032 s
16/05/27 10:03:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314595000 ms.0 from job set of time 1464314595000 ms
16/05/27 10:03:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314595000 ms.1 from job set of time 1464314595000 ms
16/05/27 10:03:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314595000 ms.1 from job set of time 1464314595000 ms
16/05/27 10:03:15 INFO scheduler.JobScheduler: Total delay: 0.055 s for time 1464314595000 ms (execution: 0.020 s)
16/05/27 10:03:15 INFO rdd.MapPartitionsRDD: Removing RDD 252 from persistence list
16/05/27 10:03:15 INFO rdd.MapPartitionsRDD: Removing RDD 251 from persistence list
16/05/27 10:03:15 INFO rdd.UnionRDD: Removing RDD 140 from persistence list
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 252
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 251
16/05/27 10:03:15 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314535000 ms: 1464314530000 ms
16/05/27 10:03:15 INFO rdd.MapPartitionsRDD: Removing RDD 257 from persistence list
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 140
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 257
16/05/27 10:03:15 INFO rdd.MapPartitionsRDD: Removing RDD 256 from persistence list
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 256
16/05/27 10:03:15 INFO rdd.MapPartitionsRDD: Removing RDD 255 from persistence list
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 255
16/05/27 10:03:15 INFO rdd.MapPartitionsRDD: Removing RDD 254 from persistence list
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 254
16/05/27 10:03:15 INFO rdd.UnionRDD: Removing RDD 143 from persistence list
16/05/27 10:03:15 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314535000 ms: 1464314530000 ms
16/05/27 10:03:15 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:15 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314530000 ms
16/05/27 10:03:15 INFO storage.BlockManager: Removing RDD 143
16/05/27 10:03:20 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:20 INFO dstream.FileInputDStream: New files at time 1464314600000 ms:
16/05/27 10:03:20 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:03:20 INFO dstream.FileInputDStream: New files at time 1464314600000 ms:
-------------------------------------------
Time: 1464314600000 ms
-------------------------------------------
16/05/27 10:03:20 INFO scheduler.JobScheduler: Added jobs for time 1464314600000 ms
16/05/27 10:03:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314600000 ms.0 from job set of time 1464314600000 ms
16/05/27 10:03:20 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:20 INFO scheduler.DAGScheduler: Job 27 finished: collect at StreamingKMeans.scala:89, took 0.000026 s
16/05/27 10:03:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314600000 ms.0 from job set of time 1464314600000 ms
16/05/27 10:03:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314600000 ms.1 from job set of time 1464314600000 ms
16/05/27 10:03:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314600000 ms.1 from job set of time 1464314600000 ms
16/05/27 10:03:20 INFO scheduler.JobScheduler: Total delay: 0.065 s for time 1464314600000 ms (execution: 0.018 s)
16/05/27 10:03:20 INFO rdd.MapPartitionsRDD: Removing RDD 262 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 262
16/05/27 10:03:20 INFO rdd.MapPartitionsRDD: Removing RDD 261 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 261
16/05/27 10:03:20 INFO rdd.UnionRDD: Removing RDD 150 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 150
16/05/27 10:03:20 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314540000 ms: 1464314535000 ms
16/05/27 10:03:20 INFO rdd.MapPartitionsRDD: Removing RDD 267 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 267
16/05/27 10:03:20 INFO rdd.MapPartitionsRDD: Removing RDD 266 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 266
16/05/27 10:03:20 INFO rdd.MapPartitionsRDD: Removing RDD 265 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 265
16/05/27 10:03:20 INFO rdd.MapPartitionsRDD: Removing RDD 264 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 264
16/05/27 10:03:20 INFO rdd.UnionRDD: Removing RDD 153 from persistence list
16/05/27 10:03:20 INFO storage.BlockManager: Removing RDD 153
16/05/27 10:03:20 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314540000 ms: 1464314535000 ms
16/05/27 10:03:20 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:20 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314535000 ms
16/05/27 10:03:25 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:25 INFO dstream.FileInputDStream: New files at time 1464314605000 ms:
16/05/27 10:03:25 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:03:25 INFO dstream.FileInputDStream: New files at time 1464314605000 ms:
16/05/27 10:03:25 INFO scheduler.JobScheduler: Added jobs for time 1464314605000 ms
16/05/27 10:03:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314605000 ms.0 from job set of time 1464314605000 ms
-------------------------------------------
Time: 1464314605000 ms
-------------------------------------------
16/05/27 10:03:25 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:25 INFO scheduler.DAGScheduler: Job 28 finished: collect at StreamingKMeans.scala:89, took 0.000022 s
16/05/27 10:03:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314605000 ms.0 from job set of time 1464314605000 ms
16/05/27 10:03:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314605000 ms.1 from job set of time 1464314605000 ms
16/05/27 10:03:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314605000 ms.1 from job set of time 1464314605000 ms
16/05/27 10:03:25 INFO scheduler.JobScheduler: Total delay: 0.029 s for time 1464314605000 ms (execution: 0.013 s)
16/05/27 10:03:25 INFO rdd.MapPartitionsRDD: Removing RDD 272 from persistence list
16/05/27 10:03:25 INFO rdd.MapPartitionsRDD: Removing RDD 271 from persistence list
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 272
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 271
16/05/27 10:03:25 INFO rdd.UnionRDD: Removing RDD 160 from persistence list
16/05/27 10:03:25 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314545000 ms: 1464314540000 ms
16/05/27 10:03:25 INFO rdd.MapPartitionsRDD: Removing RDD 277 from persistence list
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 160
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 277
16/05/27 10:03:25 INFO rdd.MapPartitionsRDD: Removing RDD 276 from persistence list
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 276
16/05/27 10:03:25 INFO rdd.MapPartitionsRDD: Removing RDD 275 from persistence list
16/05/27 10:03:25 INFO rdd.MapPartitionsRDD: Removing RDD 274 from persistence list
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 275
16/05/27 10:03:25 INFO rdd.UnionRDD: Removing RDD 163 from persistence list
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 274
16/05/27 10:03:25 INFO storage.BlockManager: Removing RDD 163
16/05/27 10:03:25 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314545000 ms: 1464314540000 ms
16/05/27 10:03:25 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:25 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314540000 ms
16/05/27 10:03:30 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:30 INFO dstream.FileInputDStream: New files at time 1464314610000 ms:
16/05/27 10:03:30 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:03:30 INFO dstream.FileInputDStream: New files at time 1464314610000 ms:
16/05/27 10:03:30 INFO scheduler.JobScheduler: Added jobs for time 1464314610000 ms
16/05/27 10:03:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314610000 ms.0 from job set of time 1464314610000 ms
-------------------------------------------
Time: 1464314610000 ms
-------------------------------------------
16/05/27 10:03:30 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:30 INFO scheduler.DAGScheduler: Job 29 finished: collect at StreamingKMeans.scala:89, took 0.000023 s
16/05/27 10:03:30 INFO scheduler.JobScheduler: Finished job streaming job 1464314610000 ms.0 from job set of time 1464314610000 ms
16/05/27 10:03:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314610000 ms.1 from job set of time 1464314610000 ms
16/05/27 10:03:30 INFO scheduler.JobScheduler: Finished job streaming job 1464314610000 ms.1 from job set of time 1464314610000 ms
16/05/27 10:03:30 INFO scheduler.JobScheduler: Total delay: 0.030 s for time 1464314610000 ms (execution: 0.014 s)
16/05/27 10:03:30 INFO rdd.MapPartitionsRDD: Removing RDD 282 from persistence list
16/05/27 10:03:30 INFO rdd.MapPartitionsRDD: Removing RDD 281 from persistence list
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 282
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 281
16/05/27 10:03:30 INFO rdd.UnionRDD: Removing RDD 170 from persistence list
16/05/27 10:03:30 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314550000 ms: 1464314545000 ms
16/05/27 10:03:30 INFO rdd.MapPartitionsRDD: Removing RDD 287 from persistence list
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 170
16/05/27 10:03:30 INFO rdd.MapPartitionsRDD: Removing RDD 286 from persistence list
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 287
16/05/27 10:03:30 INFO rdd.MapPartitionsRDD: Removing RDD 285 from persistence list
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 286
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 285
16/05/27 10:03:30 INFO rdd.MapPartitionsRDD: Removing RDD 284 from persistence list
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 284
16/05/27 10:03:30 INFO rdd.UnionRDD: Removing RDD 173 from persistence list
16/05/27 10:03:30 INFO storage.BlockManager: Removing RDD 173
16/05/27 10:03:30 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314550000 ms: 1464314545000 ms
16/05/27 10:03:30 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:30 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314545000 ms
16/05/27 10:03:35 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:03:35 INFO dstream.FileInputDStream: New files at time 1464314615000 ms:
16/05/27 10:03:35 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:35 INFO dstream.FileInputDStream: New files at time 1464314615000 ms:
16/05/27 10:03:35 INFO scheduler.JobScheduler: Added jobs for time 1464314615000 ms
16/05/27 10:03:35 INFO scheduler.JobScheduler: Starting job streaming job 1464314615000 ms.0 from job set of time 1464314615000 ms
16/05/27 10:03:35 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:35 INFO scheduler.DAGScheduler: Job 30 finished: collect at StreamingKMeans.scala:89, took 0.000025 s
16/05/27 10:03:35 INFO scheduler.JobScheduler: Finished job streaming job 1464314615000 ms.0 from job set of time 1464314615000 ms
16/05/27 10:03:35 INFO scheduler.JobScheduler: Starting job streaming job 1464314615000 ms.1 from job set of time 1464314615000 ms
-------------------------------------------
Time: 1464314615000 ms
-------------------------------------------
16/05/27 10:03:35 INFO scheduler.JobScheduler: Finished job streaming job 1464314615000 ms.1 from job set of time 1464314615000 ms
16/05/27 10:03:35 INFO rdd.MapPartitionsRDD: Removing RDD 292 from persistence list
16/05/27 10:03:35 INFO scheduler.JobScheduler: Total delay: 0.051 s for time 1464314615000 ms (execution: 0.017 s)
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 292
16/05/27 10:03:35 INFO rdd.MapPartitionsRDD: Removing RDD 291 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 291
16/05/27 10:03:35 INFO rdd.UnionRDD: Removing RDD 180 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 180
16/05/27 10:03:35 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314555000 ms: 1464314550000 ms
16/05/27 10:03:35 INFO rdd.MapPartitionsRDD: Removing RDD 297 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 297
16/05/27 10:03:35 INFO rdd.MapPartitionsRDD: Removing RDD 296 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 296
16/05/27 10:03:35 INFO rdd.MapPartitionsRDD: Removing RDD 295 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 295
16/05/27 10:03:35 INFO rdd.MapPartitionsRDD: Removing RDD 294 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 294
16/05/27 10:03:35 INFO rdd.UnionRDD: Removing RDD 183 from persistence list
16/05/27 10:03:35 INFO storage.BlockManager: Removing RDD 183
16/05/27 10:03:35 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314555000 ms: 1464314550000 ms
16/05/27 10:03:35 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:35 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314550000 ms
16/05/27 10:03:40 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:40 INFO dstream.FileInputDStream: New files at time 1464314620000 ms:
16/05/27 10:03:40 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:40 INFO dstream.FileInputDStream: New files at time 1464314620000 ms:
16/05/27 10:03:40 INFO scheduler.JobScheduler: Added jobs for time 1464314620000 ms
16/05/27 10:03:40 INFO scheduler.JobScheduler: Starting job streaming job 1464314620000 ms.0 from job set of time 1464314620000 ms
16/05/27 10:03:40 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:40 INFO scheduler.DAGScheduler: Job 31 finished: collect at StreamingKMeans.scala:89, took 0.000020 s
16/05/27 10:03:40 INFO scheduler.JobScheduler: Finished job streaming job 1464314620000 ms.0 from job set of time 1464314620000 ms
16/05/27 10:03:40 INFO scheduler.JobScheduler: Starting job streaming job 1464314620000 ms.1 from job set of time 1464314620000 ms
16/05/27 10:03:40 INFO scheduler.JobScheduler: Finished job streaming job 1464314620000 ms.1 from job set of time 1464314620000 ms
16/05/27 10:03:40 INFO scheduler.JobScheduler: Total delay: 0.047 s for time 1464314620000 ms (execution: 0.014 s)
16/05/27 10:03:40 INFO rdd.MapPartitionsRDD: Removing RDD 302 from persistence list
16/05/27 10:03:40 INFO rdd.MapPartitionsRDD: Removing RDD 301 from persistence list
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 302
16/05/27 10:03:40 INFO rdd.UnionRDD: Removing RDD 190 from persistence list
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 301
-------------------------------------------
Time: 1464314620000 ms
-------------------------------------------
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 190
16/05/27 10:03:40 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314560000 ms: 1464314555000 ms
16/05/27 10:03:40 INFO rdd.MapPartitionsRDD: Removing RDD 307 from persistence list
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 307
16/05/27 10:03:40 INFO rdd.MapPartitionsRDD: Removing RDD 306 from persistence list
16/05/27 10:03:40 INFO rdd.MapPartitionsRDD: Removing RDD 305 from persistence list
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 306
16/05/27 10:03:40 INFO rdd.MapPartitionsRDD: Removing RDD 304 from persistence list
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 305
16/05/27 10:03:40 INFO rdd.UnionRDD: Removing RDD 193 from persistence list
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 304
16/05/27 10:03:40 INFO storage.BlockManager: Removing RDD 193
16/05/27 10:03:40 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314560000 ms: 1464314555000 ms
16/05/27 10:03:40 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:40 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314555000 ms
16/05/27 10:03:45 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:03:45 INFO dstream.FileInputDStream: New files at time 1464314625000 ms:
16/05/27 10:03:45 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:45 INFO dstream.FileInputDStream: New files at time 1464314625000 ms:
16/05/27 10:03:45 INFO scheduler.JobScheduler: Added jobs for time 1464314625000 ms
16/05/27 10:03:45 INFO scheduler.JobScheduler: Starting job streaming job 1464314625000 ms.0 from job set of time 1464314625000 ms
16/05/27 10:03:45 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:45 INFO scheduler.DAGScheduler: Job 32 finished: collect at StreamingKMeans.scala:89, took 0.000026 s
16/05/27 10:03:45 INFO scheduler.JobScheduler: Finished job streaming job 1464314625000 ms.0 from job set of time 1464314625000 ms
16/05/27 10:03:45 INFO scheduler.JobScheduler: Starting job streaming job 1464314625000 ms.1 from job set of time 1464314625000 ms
16/05/27 10:03:45 INFO scheduler.JobScheduler: Finished job streaming job 1464314625000 ms.1 from job set of time 1464314625000 ms
16/05/27 10:03:45 INFO scheduler.JobScheduler: Total delay: 0.053 s for time 1464314625000 ms (execution: 0.022 s)
16/05/27 10:03:45 INFO rdd.MapPartitionsRDD: Removing RDD 312 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 312
16/05/27 10:03:45 INFO rdd.MapPartitionsRDD: Removing RDD 311 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 311
16/05/27 10:03:45 INFO rdd.UnionRDD: Removing RDD 200 from persistence list
-------------------------------------------
Time: 1464314625000 ms
-------------------------------------------
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 200
16/05/27 10:03:45 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314565000 ms: 1464314560000 ms
16/05/27 10:03:45 INFO rdd.MapPartitionsRDD: Removing RDD 317 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 317
16/05/27 10:03:45 INFO rdd.MapPartitionsRDD: Removing RDD 316 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 316
16/05/27 10:03:45 INFO rdd.MapPartitionsRDD: Removing RDD 315 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 315
16/05/27 10:03:45 INFO rdd.MapPartitionsRDD: Removing RDD 314 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 314
16/05/27 10:03:45 INFO rdd.UnionRDD: Removing RDD 203 from persistence list
16/05/27 10:03:45 INFO storage.BlockManager: Removing RDD 203
16/05/27 10:03:45 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314565000 ms: 1464314560000 ms
16/05/27 10:03:45 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:45 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314560000 ms
16/05/27 10:03:50 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:03:50 INFO dstream.FileInputDStream: New files at time 1464314630000 ms:
16/05/27 10:03:50 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:50 INFO dstream.FileInputDStream: New files at time 1464314630000 ms:
16/05/27 10:03:50 INFO scheduler.JobScheduler: Added jobs for time 1464314630000 ms
16/05/27 10:03:50 INFO scheduler.JobScheduler: Starting job streaming job 1464314630000 ms.0 from job set of time 1464314630000 ms
16/05/27 10:03:50 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:50 INFO scheduler.DAGScheduler: Job 33 finished: collect at StreamingKMeans.scala:89, took 0.000019 s
16/05/27 10:03:50 INFO scheduler.JobScheduler: Finished job streaming job 1464314630000 ms.0 from job set of time 1464314630000 ms
16/05/27 10:03:50 INFO scheduler.JobScheduler: Starting job streaming job 1464314630000 ms.1 from job set of time 1464314630000 ms
16/05/27 10:03:50 INFO scheduler.JobScheduler: Finished job streaming job 1464314630000 ms.1 from job set of time 1464314630000 ms
16/05/27 10:03:50 INFO rdd.MapPartitionsRDD: Removing RDD 322 from persistence list
16/05/27 10:03:50 INFO scheduler.JobScheduler: Total delay: 0.036 s for time 1464314630000 ms (execution: 0.013 s)
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 322
16/05/27 10:03:50 INFO rdd.MapPartitionsRDD: Removing RDD 321 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 321
16/05/27 10:03:50 INFO rdd.UnionRDD: Removing RDD 210 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 210
16/05/27 10:03:50 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314570000 ms: 1464314565000 ms
16/05/27 10:03:50 INFO rdd.MapPartitionsRDD: Removing RDD 327 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 327
16/05/27 10:03:50 INFO rdd.MapPartitionsRDD: Removing RDD 326 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 326
16/05/27 10:03:50 INFO rdd.MapPartitionsRDD: Removing RDD 325 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 325
16/05/27 10:03:50 INFO rdd.MapPartitionsRDD: Removing RDD 324 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 324
16/05/27 10:03:50 INFO rdd.UnionRDD: Removing RDD 213 from persistence list
16/05/27 10:03:50 INFO storage.BlockManager: Removing RDD 213
-------------------------------------------
Time: 1464314630000 ms
-------------------------------------------
16/05/27 10:03:50 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314570000 ms: 1464314565000 ms
16/05/27 10:03:50 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:50 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314565000 ms
16/05/27 10:03:55 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:03:55 INFO dstream.FileInputDStream: New files at time 1464314635000 ms:
16/05/27 10:03:55 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:03:55 INFO dstream.FileInputDStream: New files at time 1464314635000 ms:
16/05/27 10:03:55 INFO scheduler.JobScheduler: Added jobs for time 1464314635000 ms
16/05/27 10:03:55 INFO scheduler.JobScheduler: Starting job streaming job 1464314635000 ms.0 from job set of time 1464314635000 ms
16/05/27 10:03:55 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:03:55 INFO scheduler.DAGScheduler: Job 34 finished: collect at StreamingKMeans.scala:89, took 0.000021 s
16/05/27 10:03:55 INFO scheduler.JobScheduler: Finished job streaming job 1464314635000 ms.0 from job set of time 1464314635000 ms
16/05/27 10:03:55 INFO scheduler.JobScheduler: Starting job streaming job 1464314635000 ms.1 from job set of time 1464314635000 ms
16/05/27 10:03:55 INFO scheduler.JobScheduler: Finished job streaming job 1464314635000 ms.1 from job set of time 1464314635000 ms
16/05/27 10:03:55 INFO rdd.MapPartitionsRDD: Removing RDD 332 from persistence list
16/05/27 10:03:55 INFO scheduler.JobScheduler: Total delay: 0.041 s for time 1464314635000 ms (execution: 0.018 s)
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 332
16/05/27 10:03:55 INFO rdd.MapPartitionsRDD: Removing RDD 331 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 331
16/05/27 10:03:55 INFO rdd.UnionRDD: Removing RDD 220 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 220
16/05/27 10:03:55 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314575000 ms: 1464314570000 ms
-------------------------------------------
Time: 1464314635000 ms
-------------------------------------------
16/05/27 10:03:55 INFO rdd.MapPartitionsRDD: Removing RDD 337 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 337
16/05/27 10:03:55 INFO rdd.MapPartitionsRDD: Removing RDD 336 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 336
16/05/27 10:03:55 INFO rdd.MapPartitionsRDD: Removing RDD 335 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 335
16/05/27 10:03:55 INFO rdd.MapPartitionsRDD: Removing RDD 334 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 334
16/05/27 10:03:55 INFO rdd.UnionRDD: Removing RDD 223 from persistence list
16/05/27 10:03:55 INFO storage.BlockManager: Removing RDD 223
16/05/27 10:03:55 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314575000 ms: 1464314570000 ms
16/05/27 10:03:55 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:03:55 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314570000 ms
16/05/27 10:04:00 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:00 INFO dstream.FileInputDStream: New files at time 1464314640000 ms:
16/05/27 10:04:00 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:04:00 INFO dstream.FileInputDStream: New files at time 1464314640000 ms:
16/05/27 10:04:00 INFO scheduler.JobScheduler: Added jobs for time 1464314640000 ms
16/05/27 10:04:00 INFO scheduler.JobScheduler: Starting job streaming job 1464314640000 ms.0 from job set of time 1464314640000 ms
16/05/27 10:04:00 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:00 INFO scheduler.DAGScheduler: Job 35 finished: collect at StreamingKMeans.scala:89, took 0.000023 s
16/05/27 10:04:00 INFO scheduler.JobScheduler: Finished job streaming job 1464314640000 ms.0 from job set of time 1464314640000 ms
16/05/27 10:04:00 INFO scheduler.JobScheduler: Starting job streaming job 1464314640000 ms.1 from job set of time 1464314640000 ms
16/05/27 10:04:00 INFO scheduler.JobScheduler: Finished job streaming job 1464314640000 ms.1 from job set of time 1464314640000 ms
16/05/27 10:04:00 INFO rdd.MapPartitionsRDD: Removing RDD 342 from persistence list
16/05/27 10:04:00 INFO scheduler.JobScheduler: Total delay: 0.048 s for time 1464314640000 ms (execution: 0.017 s)
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 342
16/05/27 10:04:00 INFO rdd.MapPartitionsRDD: Removing RDD 341 from persistence list
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 341
16/05/27 10:04:00 INFO rdd.UnionRDD: Removing RDD 230 from persistence list
16/05/27 10:04:00 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314580000 ms: 1464314575000 ms
16/05/27 10:04:00 INFO rdd.MapPartitionsRDD: Removing RDD 347 from persistence list
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 230
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 347
16/05/27 10:04:00 INFO rdd.MapPartitionsRDD: Removing RDD 346 from persistence list
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 346
16/05/27 10:04:00 INFO rdd.MapPartitionsRDD: Removing RDD 345 from persistence list
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 345
16/05/27 10:04:00 INFO rdd.MapPartitionsRDD: Removing RDD 344 from persistence list
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 344
16/05/27 10:04:00 INFO rdd.UnionRDD: Removing RDD 233 from persistence list
16/05/27 10:04:00 INFO storage.BlockManager: Removing RDD 233
16/05/27 10:04:00 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314580000 ms: 1464314575000 ms
16/05/27 10:04:00 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:00 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314575000 ms
-------------------------------------------
Time: 1464314640000 ms
-------------------------------------------
16/05/27 10:04:05 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:05 INFO dstream.FileInputDStream: New files at time 1464314645000 ms:
16/05/27 10:04:05 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:05 INFO dstream.FileInputDStream: New files at time 1464314645000 ms:
16/05/27 10:04:05 INFO scheduler.JobScheduler: Added jobs for time 1464314645000 ms
16/05/27 10:04:05 INFO scheduler.JobScheduler: Starting job streaming job 1464314645000 ms.0 from job set of time 1464314645000 ms
16/05/27 10:04:05 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:05 INFO scheduler.DAGScheduler: Job 36 finished: collect at StreamingKMeans.scala:89, took 0.000018 s
16/05/27 10:04:05 INFO scheduler.JobScheduler: Finished job streaming job 1464314645000 ms.0 from job set of time 1464314645000 ms
16/05/27 10:04:05 INFO scheduler.JobScheduler: Starting job streaming job 1464314645000 ms.1 from job set of time 1464314645000 ms
16/05/27 10:04:05 INFO scheduler.JobScheduler: Finished job streaming job 1464314645000 ms.1 from job set of time 1464314645000 ms
16/05/27 10:04:05 INFO rdd.MapPartitionsRDD: Removing RDD 352 from persistence list
16/05/27 10:04:05 INFO scheduler.JobScheduler: Total delay: 0.044 s for time 1464314645000 ms (execution: 0.012 s)
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 352
16/05/27 10:04:05 INFO rdd.MapPartitionsRDD: Removing RDD 351 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 351
-------------------------------------------
Time: 1464314645000 ms
-------------------------------------------
16/05/27 10:04:05 INFO rdd.UnionRDD: Removing RDD 240 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 240
16/05/27 10:04:05 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314585000 ms: 1464314580000 ms
16/05/27 10:04:05 INFO rdd.MapPartitionsRDD: Removing RDD 357 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 357
16/05/27 10:04:05 INFO rdd.MapPartitionsRDD: Removing RDD 356 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 356
16/05/27 10:04:05 INFO rdd.MapPartitionsRDD: Removing RDD 355 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 355
16/05/27 10:04:05 INFO rdd.MapPartitionsRDD: Removing RDD 354 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 354
16/05/27 10:04:05 INFO rdd.UnionRDD: Removing RDD 243 from persistence list
16/05/27 10:04:05 INFO storage.BlockManager: Removing RDD 243
16/05/27 10:04:05 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314585000 ms: 1464314580000 ms
16/05/27 10:04:05 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:05 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314580000 ms
16/05/27 10:04:10 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:04:10 INFO dstream.FileInputDStream: New files at time 1464314650000 ms:
16/05/27 10:04:10 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:10 INFO dstream.FileInputDStream: New files at time 1464314650000 ms:
16/05/27 10:04:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314650000 ms.0 from job set of time 1464314650000 ms
16/05/27 10:04:10 INFO scheduler.JobScheduler: Added jobs for time 1464314650000 ms
-------------------------------------------
Time: 1464314650000 ms
-------------------------------------------
16/05/27 10:04:10 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:10 INFO scheduler.DAGScheduler: Job 37 finished: collect at StreamingKMeans.scala:89, took 0.000023 s
16/05/27 10:04:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314650000 ms.0 from job set of time 1464314650000 ms
16/05/27 10:04:10 INFO scheduler.JobScheduler: Starting job streaming job 1464314650000 ms.1 from job set of time 1464314650000 ms
16/05/27 10:04:10 INFO scheduler.JobScheduler: Finished job streaming job 1464314650000 ms.1 from job set of time 1464314650000 ms
16/05/27 10:04:10 INFO scheduler.JobScheduler: Total delay: 0.053 s for time 1464314650000 ms (execution: 0.014 s)
16/05/27 10:04:10 INFO rdd.MapPartitionsRDD: Removing RDD 362 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 362
16/05/27 10:04:10 INFO rdd.MapPartitionsRDD: Removing RDD 361 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 361
16/05/27 10:04:10 INFO rdd.UnionRDD: Removing RDD 250 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 250
16/05/27 10:04:10 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314590000 ms: 1464314585000 ms
16/05/27 10:04:10 INFO rdd.MapPartitionsRDD: Removing RDD 367 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 367
16/05/27 10:04:10 INFO rdd.MapPartitionsRDD: Removing RDD 366 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 366
16/05/27 10:04:10 INFO rdd.MapPartitionsRDD: Removing RDD 365 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 365
16/05/27 10:04:10 INFO rdd.MapPartitionsRDD: Removing RDD 364 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 364
16/05/27 10:04:10 INFO rdd.UnionRDD: Removing RDD 253 from persistence list
16/05/27 10:04:10 INFO storage.BlockManager: Removing RDD 253
16/05/27 10:04:10 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314590000 ms: 1464314585000 ms
16/05/27 10:04:10 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:10 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314585000 ms
16/05/27 10:04:15 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:15 INFO dstream.FileInputDStream: New files at time 1464314655000 ms:
16/05/27 10:04:15 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:04:15 INFO dstream.FileInputDStream: New files at time 1464314655000 ms:
16/05/27 10:04:15 INFO scheduler.JobScheduler: Added jobs for time 1464314655000 ms
16/05/27 10:04:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314655000 ms.0 from job set of time 1464314655000 ms
-------------------------------------------
Time: 1464314655000 ms
-------------------------------------------
16/05/27 10:04:15 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:15 INFO scheduler.DAGScheduler: Job 38 finished: collect at StreamingKMeans.scala:89, took 0.000026 s
16/05/27 10:04:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314655000 ms.0 from job set of time 1464314655000 ms
16/05/27 10:04:15 INFO scheduler.JobScheduler: Starting job streaming job 1464314655000 ms.1 from job set of time 1464314655000 ms
16/05/27 10:04:15 INFO scheduler.JobScheduler: Finished job streaming job 1464314655000 ms.1 from job set of time 1464314655000 ms
16/05/27 10:04:15 INFO rdd.MapPartitionsRDD: Removing RDD 372 from persistence list
16/05/27 10:04:15 INFO scheduler.JobScheduler: Total delay: 0.027 s for time 1464314655000 ms (execution: 0.014 s)
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 372
16/05/27 10:04:15 INFO rdd.MapPartitionsRDD: Removing RDD 371 from persistence list
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 371
16/05/27 10:04:15 INFO rdd.UnionRDD: Removing RDD 260 from persistence list
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 260
16/05/27 10:04:15 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314595000 ms: 1464314590000 ms
16/05/27 10:04:15 INFO rdd.MapPartitionsRDD: Removing RDD 377 from persistence list
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 377
16/05/27 10:04:15 INFO rdd.MapPartitionsRDD: Removing RDD 376 from persistence list
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 376
16/05/27 10:04:15 INFO rdd.MapPartitionsRDD: Removing RDD 375 from persistence list
16/05/27 10:04:15 INFO rdd.MapPartitionsRDD: Removing RDD 374 from persistence list
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 375
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 374
16/05/27 10:04:15 INFO rdd.UnionRDD: Removing RDD 263 from persistence list
16/05/27 10:04:15 INFO storage.BlockManager: Removing RDD 263
16/05/27 10:04:15 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314595000 ms: 1464314590000 ms
16/05/27 10:04:15 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:15 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314590000 ms
16/05/27 10:04:20 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:20 INFO dstream.FileInputDStream: New files at time 1464314660000 ms:
16/05/27 10:04:20 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:20 INFO dstream.FileInputDStream: New files at time 1464314660000 ms:
16/05/27 10:04:20 INFO scheduler.JobScheduler: Added jobs for time 1464314660000 ms
16/05/27 10:04:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314660000 ms.0 from job set of time 1464314660000 ms
-------------------------------------------
Time: 1464314660000 ms
-------------------------------------------
16/05/27 10:04:20 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:20 INFO scheduler.DAGScheduler: Job 39 finished: collect at StreamingKMeans.scala:89, took 0.000027 s
16/05/27 10:04:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314660000 ms.0 from job set of time 1464314660000 ms
16/05/27 10:04:20 INFO scheduler.JobScheduler: Starting job streaming job 1464314660000 ms.1 from job set of time 1464314660000 ms
16/05/27 10:04:20 INFO scheduler.JobScheduler: Finished job streaming job 1464314660000 ms.1 from job set of time 1464314660000 ms
16/05/27 10:04:20 INFO rdd.MapPartitionsRDD: Removing RDD 382 from persistence list
16/05/27 10:04:20 INFO scheduler.JobScheduler: Total delay: 0.054 s for time 1464314660000 ms (execution: 0.021 s)
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 382
16/05/27 10:04:20 INFO rdd.MapPartitionsRDD: Removing RDD 381 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 381
16/05/27 10:04:20 INFO rdd.UnionRDD: Removing RDD 270 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 270
16/05/27 10:04:20 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314600000 ms: 1464314595000 ms
16/05/27 10:04:20 INFO rdd.MapPartitionsRDD: Removing RDD 387 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 387
16/05/27 10:04:20 INFO rdd.MapPartitionsRDD: Removing RDD 386 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 386
16/05/27 10:04:20 INFO rdd.MapPartitionsRDD: Removing RDD 385 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 385
16/05/27 10:04:20 INFO rdd.MapPartitionsRDD: Removing RDD 384 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 384
16/05/27 10:04:20 INFO rdd.UnionRDD: Removing RDD 273 from persistence list
16/05/27 10:04:20 INFO storage.BlockManager: Removing RDD 273
16/05/27 10:04:20 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314600000 ms: 1464314595000 ms
16/05/27 10:04:20 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:20 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314595000 ms
16/05/27 10:04:25 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:25 INFO dstream.FileInputDStream: New files at time 1464314665000 ms:
16/05/27 10:04:25 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:04:25 INFO dstream.FileInputDStream: New files at time 1464314665000 ms:
16/05/27 10:04:25 INFO scheduler.JobScheduler: Added jobs for time 1464314665000 ms
16/05/27 10:04:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314665000 ms.0 from job set of time 1464314665000 ms
16/05/27 10:04:25 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:25 INFO scheduler.DAGScheduler: Job 40 finished: collect at StreamingKMeans.scala:89, took 0.000027 s
16/05/27 10:04:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314665000 ms.0 from job set of time 1464314665000 ms
16/05/27 10:04:25 INFO scheduler.JobScheduler: Starting job streaming job 1464314665000 ms.1 from job set of time 1464314665000 ms
16/05/27 10:04:25 INFO scheduler.JobScheduler: Finished job streaming job 1464314665000 ms.1 from job set of time 1464314665000 ms
16/05/27 10:04:25 INFO scheduler.JobScheduler: Total delay: 0.035 s for time 1464314665000 ms (execution: 0.019 s)
16/05/27 10:04:25 INFO rdd.MapPartitionsRDD: Removing RDD 392 from persistence list
-------------------------------------------
Time: 1464314665000 ms
-------------------------------------------
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 392
16/05/27 10:04:25 INFO rdd.MapPartitionsRDD: Removing RDD 391 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 391
16/05/27 10:04:25 INFO rdd.UnionRDD: Removing RDD 280 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 280
16/05/27 10:04:25 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314605000 ms: 1464314600000 ms
16/05/27 10:04:25 INFO rdd.MapPartitionsRDD: Removing RDD 397 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 397
16/05/27 10:04:25 INFO rdd.MapPartitionsRDD: Removing RDD 396 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 396
16/05/27 10:04:25 INFO rdd.MapPartitionsRDD: Removing RDD 395 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 395
16/05/27 10:04:25 INFO rdd.MapPartitionsRDD: Removing RDD 394 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 394
16/05/27 10:04:25 INFO rdd.UnionRDD: Removing RDD 283 from persistence list
16/05/27 10:04:25 INFO storage.BlockManager: Removing RDD 283
16/05/27 10:04:25 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314605000 ms: 1464314600000 ms
16/05/27 10:04:25 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:25 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314600000 ms
16/05/27 10:04:30 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:30 INFO dstream.FileInputDStream: New files at time 1464314670000 ms:
16/05/27 10:04:30 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:04:30 INFO dstream.FileInputDStream: New files at time 1464314670000 ms:
16/05/27 10:04:30 INFO scheduler.JobScheduler: Added jobs for time 1464314670000 ms
16/05/27 10:04:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314670000 ms.0 from job set of time 1464314670000 ms
16/05/27 10:04:30 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:30 INFO scheduler.DAGScheduler: Job 41 finished: collect at StreamingKMeans.scala:89, took 0.000028 s
-------------------------------------------
Time: 1464314670000 ms
-------------------------------------------
16/05/27 10:04:30 INFO scheduler.JobScheduler: Finished job streaming job 1464314670000 ms.0 from job set of time 1464314670000 ms
16/05/27 10:04:30 INFO scheduler.JobScheduler: Starting job streaming job 1464314670000 ms.1 from job set of time 1464314670000 ms
16/05/27 10:04:30 INFO scheduler.JobScheduler: Finished job streaming job 1464314670000 ms.1 from job set of time 1464314670000 ms
16/05/27 10:04:30 INFO scheduler.JobScheduler: Total delay: 0.040 s for time 1464314670000 ms (execution: 0.017 s)
16/05/27 10:04:30 INFO rdd.MapPartitionsRDD: Removing RDD 402 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 402
16/05/27 10:04:30 INFO rdd.MapPartitionsRDD: Removing RDD 401 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 401
16/05/27 10:04:30 INFO rdd.UnionRDD: Removing RDD 290 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 290
16/05/27 10:04:30 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314610000 ms: 1464314605000 ms
16/05/27 10:04:30 INFO rdd.MapPartitionsRDD: Removing RDD 407 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 407
16/05/27 10:04:30 INFO rdd.MapPartitionsRDD: Removing RDD 406 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 406
16/05/27 10:04:30 INFO rdd.MapPartitionsRDD: Removing RDD 405 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 405
16/05/27 10:04:30 INFO rdd.MapPartitionsRDD: Removing RDD 404 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 404
16/05/27 10:04:30 INFO rdd.UnionRDD: Removing RDD 293 from persistence list
16/05/27 10:04:30 INFO storage.BlockManager: Removing RDD 293
16/05/27 10:04:30 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314610000 ms: 1464314605000 ms
16/05/27 10:04:30 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:30 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314605000 ms
16/05/27 10:04:35 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:04:35 INFO dstream.FileInputDStream: New files at time 1464314675000 ms:
16/05/27 10:04:35 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:35 INFO dstream.FileInputDStream: New files at time 1464314675000 ms:
16/05/27 10:04:35 INFO scheduler.JobScheduler: Added jobs for time 1464314675000 ms
16/05/27 10:04:35 INFO scheduler.JobScheduler: Starting job streaming job 1464314675000 ms.0 from job set of time 1464314675000 ms
-------------------------------------------
Time: 1464314675000 ms
-------------------------------------------
16/05/27 10:04:35 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:35 INFO scheduler.DAGScheduler: Job 42 finished: collect at StreamingKMeans.scala:89, took 0.000020 s
16/05/27 10:04:35 INFO scheduler.JobScheduler: Finished job streaming job 1464314675000 ms.0 from job set of time 1464314675000 ms
16/05/27 10:04:35 INFO scheduler.JobScheduler: Starting job streaming job 1464314675000 ms.1 from job set of time 1464314675000 ms
16/05/27 10:04:35 INFO scheduler.JobScheduler: Finished job streaming job 1464314675000 ms.1 from job set of time 1464314675000 ms
16/05/27 10:04:35 INFO scheduler.JobScheduler: Total delay: 0.071 s for time 1464314675000 ms (execution: 0.039 s)
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 41
16/05/27 10:04:35 INFO rdd.MapPartitionsRDD: Removing RDD 412 from persistence list
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 40
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 39
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 38
16/05/27 10:04:35 INFO rdd.MapPartitionsRDD: Removing RDD 411 from persistence list
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 412
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 37
16/05/27 10:04:35 INFO rdd.UnionRDD: Removing RDD 300 from persistence list
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 411
16/05/27 10:04:35 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314615000 ms: 1464314610000 ms
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 36
16/05/27 10:04:35 INFO rdd.MapPartitionsRDD: Removing RDD 417 from persistence list
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 300
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 35
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 34
16/05/27 10:04:35 INFO rdd.MapPartitionsRDD: Removing RDD 416 from persistence list
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 417
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 416
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 33
16/05/27 10:04:35 INFO rdd.MapPartitionsRDD: Removing RDD 415 from persistence list
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 415
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 32
16/05/27 10:04:35 INFO rdd.MapPartitionsRDD: Removing RDD 414 from persistence list
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 31
16/05/27 10:04:35 INFO rdd.UnionRDD: Removing RDD 303 from persistence list
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 30
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 414
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 29
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 28
16/05/27 10:04:35 INFO storage.BlockManager: Removing RDD 303
16/05/27 10:04:35 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314615000 ms: 1464314610000 ms
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 27
16/05/27 10:04:35 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:35 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314610000 ms
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 26
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 25
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 24
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 23
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 22
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 21
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 20
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 19
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 18
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 17
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 16
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 15
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 14
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 13
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 12
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 11
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 10
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 9
16/05/27 10:04:35 INFO spark.ContextCleaner: Cleaned shuffle 8
16/05/27 10:04:40 INFO dstream.FileInputDStream: Finding new files took 5 ms
16/05/27 10:04:40 INFO dstream.FileInputDStream: New files at time 1464314680000 ms:
16/05/27 10:04:40 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:40 INFO dstream.FileInputDStream: New files at time 1464314680000 ms:
16/05/27 10:04:40 INFO scheduler.JobScheduler: Added jobs for time 1464314680000 ms
16/05/27 10:04:40 INFO scheduler.JobScheduler: Starting job streaming job 1464314680000 ms.0 from job set of time 1464314680000 ms
16/05/27 10:04:40 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:40 INFO scheduler.DAGScheduler: Job 43 finished: collect at StreamingKMeans.scala:89, took 0.000023 s
-------------------------------------------
Time: 1464314680000 ms
-------------------------------------------
16/05/27 10:04:40 INFO scheduler.JobScheduler: Finished job streaming job 1464314680000 ms.0 from job set of time 1464314680000 ms
16/05/27 10:04:40 INFO scheduler.JobScheduler: Starting job streaming job 1464314680000 ms.1 from job set of time 1464314680000 ms
16/05/27 10:04:40 INFO scheduler.JobScheduler: Finished job streaming job 1464314680000 ms.1 from job set of time 1464314680000 ms
16/05/27 10:04:40 INFO scheduler.JobScheduler: Total delay: 0.049 s for time 1464314680000 ms (execution: 0.018 s)
16/05/27 10:04:40 INFO rdd.MapPartitionsRDD: Removing RDD 422 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 422
16/05/27 10:04:40 INFO rdd.MapPartitionsRDD: Removing RDD 421 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 421
16/05/27 10:04:40 INFO rdd.UnionRDD: Removing RDD 310 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 310
16/05/27 10:04:40 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314620000 ms: 1464314615000 ms
16/05/27 10:04:40 INFO rdd.MapPartitionsRDD: Removing RDD 427 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 427
16/05/27 10:04:40 INFO rdd.MapPartitionsRDD: Removing RDD 426 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 426
16/05/27 10:04:40 INFO rdd.MapPartitionsRDD: Removing RDD 425 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 425
16/05/27 10:04:40 INFO rdd.MapPartitionsRDD: Removing RDD 424 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 424
16/05/27 10:04:40 INFO rdd.UnionRDD: Removing RDD 313 from persistence list
16/05/27 10:04:40 INFO storage.BlockManager: Removing RDD 313
16/05/27 10:04:40 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314620000 ms: 1464314615000 ms
16/05/27 10:04:40 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:40 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314615000 ms
16/05/27 10:04:45 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:45 INFO dstream.FileInputDStream: New files at time 1464314685000 ms:
16/05/27 10:04:45 INFO dstream.FileInputDStream: Finding new files took 0 ms
16/05/27 10:04:45 INFO dstream.FileInputDStream: New files at time 1464314685000 ms:
16/05/27 10:04:45 INFO scheduler.JobScheduler: Added jobs for time 1464314685000 ms
16/05/27 10:04:45 INFO scheduler.JobScheduler: Starting job streaming job 1464314685000 ms.0 from job set of time 1464314685000 ms
-------------------------------------------
Time: 1464314685000 ms
-------------------------------------------
16/05/27 10:04:45 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:45 INFO scheduler.DAGScheduler: Job 44 finished: collect at StreamingKMeans.scala:89, took 0.000035 s
16/05/27 10:04:45 INFO scheduler.JobScheduler: Finished job streaming job 1464314685000 ms.0 from job set of time 1464314685000 ms
16/05/27 10:04:45 INFO scheduler.JobScheduler: Starting job streaming job 1464314685000 ms.1 from job set of time 1464314685000 ms
16/05/27 10:04:45 INFO scheduler.JobScheduler: Finished job streaming job 1464314685000 ms.1 from job set of time 1464314685000 ms
16/05/27 10:04:45 INFO rdd.MapPartitionsRDD: Removing RDD 432 from persistence list
16/05/27 10:04:45 INFO scheduler.JobScheduler: Total delay: 0.048 s for time 1464314685000 ms (execution: 0.020 s)
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 432
16/05/27 10:04:45 INFO rdd.MapPartitionsRDD: Removing RDD 431 from persistence list
16/05/27 10:04:45 INFO rdd.UnionRDD: Removing RDD 320 from persistence list
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 431
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 320
16/05/27 10:04:45 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314625000 ms: 1464314620000 ms
16/05/27 10:04:45 INFO rdd.MapPartitionsRDD: Removing RDD 437 from persistence list
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 437
16/05/27 10:04:45 INFO rdd.MapPartitionsRDD: Removing RDD 436 from persistence list
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 436
16/05/27 10:04:45 INFO rdd.MapPartitionsRDD: Removing RDD 435 from persistence list
16/05/27 10:04:45 INFO rdd.MapPartitionsRDD: Removing RDD 434 from persistence list
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 435
16/05/27 10:04:45 INFO rdd.UnionRDD: Removing RDD 323 from persistence list
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 434
16/05/27 10:04:45 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314625000 ms: 1464314620000 ms
16/05/27 10:04:45 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:45 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314620000 ms
16/05/27 10:04:45 INFO storage.BlockManager: Removing RDD 323
16/05/27 10:04:50 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:04:50 INFO dstream.FileInputDStream: New files at time 1464314690000 ms:
16/05/27 10:04:50 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:50 INFO dstream.FileInputDStream: New files at time 1464314690000 ms:
16/05/27 10:04:50 INFO scheduler.JobScheduler: Added jobs for time 1464314690000 ms
16/05/27 10:04:50 INFO scheduler.JobScheduler: Starting job streaming job 1464314690000 ms.0 from job set of time 1464314690000 ms
-------------------------------------------
Time: 1464314690000 ms
-------------------------------------------
16/05/27 10:04:50 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:50 INFO scheduler.DAGScheduler: Job 45 finished: collect at StreamingKMeans.scala:89, took 0.000026 s
16/05/27 10:04:50 INFO scheduler.JobScheduler: Finished job streaming job 1464314690000 ms.0 from job set of time 1464314690000 ms
16/05/27 10:04:50 INFO scheduler.JobScheduler: Starting job streaming job 1464314690000 ms.1 from job set of time 1464314690000 ms
16/05/27 10:04:50 INFO scheduler.JobScheduler: Finished job streaming job 1464314690000 ms.1 from job set of time 1464314690000 ms
16/05/27 10:04:50 INFO scheduler.JobScheduler: Total delay: 0.052 s for time 1464314690000 ms (execution: 0.019 s)
16/05/27 10:04:50 INFO rdd.MapPartitionsRDD: Removing RDD 442 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 442
16/05/27 10:04:50 INFO rdd.MapPartitionsRDD: Removing RDD 441 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 441
16/05/27 10:04:50 INFO rdd.UnionRDD: Removing RDD 330 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 330
16/05/27 10:04:50 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314630000 ms: 1464314625000 ms
16/05/27 10:04:50 INFO rdd.MapPartitionsRDD: Removing RDD 447 from persistence list
16/05/27 10:04:50 INFO rdd.MapPartitionsRDD: Removing RDD 446 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 447
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 446
16/05/27 10:04:50 INFO rdd.MapPartitionsRDD: Removing RDD 445 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 445
16/05/27 10:04:50 INFO rdd.MapPartitionsRDD: Removing RDD 444 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 444
16/05/27 10:04:50 INFO rdd.UnionRDD: Removing RDD 333 from persistence list
16/05/27 10:04:50 INFO storage.BlockManager: Removing RDD 333
16/05/27 10:04:50 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314630000 ms: 1464314625000 ms
16/05/27 10:04:50 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:50 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314625000 ms
16/05/27 10:04:55 INFO dstream.FileInputDStream: Finding new files took 2 ms
16/05/27 10:04:55 INFO dstream.FileInputDStream: New files at time 1464314695000 ms:
16/05/27 10:04:55 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:04:55 INFO dstream.FileInputDStream: New files at time 1464314695000 ms:
-------------------------------------------
Time: 1464314695000 ms
-------------------------------------------
16/05/27 10:04:55 INFO scheduler.JobScheduler: Added jobs for time 1464314695000 ms
16/05/27 10:04:55 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:04:55 INFO scheduler.DAGScheduler: Job 46 finished: collect at StreamingKMeans.scala:89, took 0.000022 s
16/05/27 10:04:55 INFO scheduler.JobScheduler: Starting job streaming job 1464314695000 ms.0 from job set of time 1464314695000 ms
16/05/27 10:04:55 INFO scheduler.JobScheduler: Finished job streaming job 1464314695000 ms.0 from job set of time 1464314695000 ms
16/05/27 10:04:55 INFO scheduler.JobScheduler: Starting job streaming job 1464314695000 ms.1 from job set of time 1464314695000 ms
16/05/27 10:04:55 INFO scheduler.JobScheduler: Finished job streaming job 1464314695000 ms.1 from job set of time 1464314695000 ms
16/05/27 10:04:55 INFO scheduler.JobScheduler: Total delay: 0.055 s for time 1464314695000 ms (execution: 0.001 s)
16/05/27 10:04:55 INFO rdd.MapPartitionsRDD: Removing RDD 452 from persistence list
16/05/27 10:04:55 INFO rdd.MapPartitionsRDD: Removing RDD 451 from persistence list
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 452
16/05/27 10:04:55 INFO rdd.UnionRDD: Removing RDD 340 from persistence list
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 451
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 340
16/05/27 10:04:55 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314635000 ms: 1464314630000 ms
16/05/27 10:04:55 INFO rdd.MapPartitionsRDD: Removing RDD 457 from persistence list
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 457
16/05/27 10:04:55 INFO rdd.MapPartitionsRDD: Removing RDD 456 from persistence list
16/05/27 10:04:55 INFO rdd.MapPartitionsRDD: Removing RDD 455 from persistence list
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 456
16/05/27 10:04:55 INFO rdd.MapPartitionsRDD: Removing RDD 454 from persistence list
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 455
16/05/27 10:04:55 INFO rdd.UnionRDD: Removing RDD 343 from persistence list
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 454
16/05/27 10:04:55 INFO storage.BlockManager: Removing RDD 343
16/05/27 10:04:55 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314635000 ms: 1464314630000 ms
16/05/27 10:04:55 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:04:55 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314630000 ms
16/05/27 10:05:00 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:05:00 INFO dstream.FileInputDStream: New files at time 1464314700000 ms:
16/05/27 10:05:00 INFO dstream.FileInputDStream: Finding new files took 1 ms
16/05/27 10:05:00 INFO dstream.FileInputDStream: New files at time 1464314700000 ms:
file:/D:/testDir/hjr003.txt
16/05/27 10:05:00 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 184.3 KB, free 184.3 KB)
16/05/27 10:05:00 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 20.5 KB, free 204.7 KB)
16/05/27 10:05:00 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:55002 (size: 20.5 KB, free: 958.2 MB)
16/05/27 10:05:00 INFO spark.SparkContext: Created broadcast 0 from textFileStream at StreamingKMeansExample.scala:27
16/05/27 10:05:00 INFO input.FileInputFormat: Total input paths to process : 1
16/05/27 10:05:00 INFO scheduler.JobScheduler: Added jobs for time 1464314700000 ms
16/05/27 10:05:00 INFO scheduler.JobScheduler: Starting job streaming job 1464314700000 ms.0 from job set of time 1464314700000 ms
16/05/27 10:05:00 INFO spark.SparkContext: Starting job: collect at StreamingKMeans.scala:89
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Job 47 finished: collect at StreamingKMeans.scala:89, took 0.000050 s
16/05/27 10:05:00 INFO scheduler.JobScheduler: Finished job streaming job 1464314700000 ms.0 from job set of time 1464314700000 ms
16/05/27 10:05:00 INFO scheduler.JobScheduler: Starting job streaming job 1464314700000 ms.1 from job set of time 1464314700000 ms
16/05/27 10:05:00 INFO spark.SparkContext: Starting job: print at StreamingKMeansExample.scala:36
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Got job 48 (print at StreamingKMeansExample.scala:36) with 1 output partitions
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (print at StreamingKMeansExample.scala:36)
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Parents of final stage: List()
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Missing parents: List()
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[478] at predictOnValues at StreamingKMeansExample.scala:36), which has no missing parents
16/05/27 10:05:00 INFO storage.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.1 KB, free 208.8 KB)
16/05/27 10:05:00 INFO storage.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.3 KB, free 211.2 KB)
16/05/27 10:05:00 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:55002 (size: 2.3 KB, free: 958.2 MB)
16/05/27 10:05:00 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[478] at predictOnValues at StreamingKMeansExample.scala:36)
16/05/27 10:05:00 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/05/27 10:05:00 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 2270 bytes)
16/05/27 10:05:00 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
16/05/27 10:05:00 INFO rdd.NewHadoopRDD: Input split: file:/D:/testDir/hjr003.txt:0+79
16/05/27 10:05:00 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalArgumentException: requirement failed
at scala.Predef$.require(Predef.scala:221)
at org.apache.spark.mllib.util.MLUtils$.fastSquaredDistance(MLUtils.scala:330)
at org.apache.spark.mllib.clustering.KMeans$.fastSquaredDistance(KMeans.scala:595)
at org.apache.spark.mllib.clustering.KMeans$$anonfun$findClosest$1.apply(KMeans.scala:569)
at org.apache.spark.mllib.clustering.KMeans$$anonfun$findClosest$1.apply(KMeans.scala:563)
at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:73)
at org.apache.spark.mllib.clustering.KMeans$.findClosest(KMeans.scala:563)
at org.apache.spark.mllib.clustering.KMeansModel.predict(KMeansModel.scala:60)
at org.apache.spark.mllib.clustering.StreamingKMeans$$anonfun$predictOnValues$1.apply(StreamingKMeans.scala:293)
at org.apache.spark.mllib.clustering.StreamingKMeans$$anonfun$predictOnValues$1.apply(StreamingKMeans.scala:293)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:755)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:755)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1328)
at org.apache.spark.rdd.RDD$$anonfun$take$1$$anonfun$28.apply(RDD.scala:1328)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1869)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1869)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
16/05/27 10:05:00 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: requirement failed
	... (36 frames identical to the executor stack trace above; omitted)
16/05/27 10:05:00 ERROR scheduler.TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
16/05/27 10:05:00 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/05/27 10:05:00 INFO scheduler.TaskSchedulerImpl: Cancelling stage 0
16/05/27 10:05:00 INFO scheduler.DAGScheduler: ResultStage 0 (print at StreamingKMeansExample.scala:36) failed in 0.140 s
16/05/27 10:05:00 INFO scheduler.DAGScheduler: Job 48 failed: print at StreamingKMeansExample.scala:36, took 0.197546 s
16/05/27 10:05:00 INFO scheduler.JobScheduler: Finished job streaming job 1464314700000 ms.1 from job set of time 1464314700000 ms
16/05/27 10:05:00 INFO scheduler.JobScheduler: Total delay: 0.627 s for time 1464314700000 ms (execution: 0.232 s)
16/05/27 10:05:00 INFO rdd.MapPartitionsRDD: Removing RDD 462 from persistence list
16/05/27 10:05:00 ERROR scheduler.JobScheduler: Error running job streaming job 1464314700000 ms.1
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: requirement failed
	... (36 frames identical to the executor stack trace above; omitted)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1843)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1856)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1869)
at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1328)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
at org.apache.spark.rdd.RDD.take(RDD.scala:1302)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$5$1.apply(DStream.scala:768)
at org.apache.spark.streaming.dstream.DStream$$anonfun$print$2$$anonfun$foreachFunc$5$1.apply(DStream.scala:767)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:426)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:49)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:224)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:223)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.IllegalArgumentException: requirement failed
	... (frames identical to the task exception above; omitted)
... 3 more
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 462
16/05/27 10:05:00 INFO rdd.MapPartitionsRDD: Removing RDD 461 from persistence list
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.IllegalArgumentException: requirement failed
	... (task stack trace, driver stacktrace, and cause identical to the JobScheduler error above; omitted)
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 461
16/05/27 10:05:00 INFO rdd.UnionRDD: Removing RDD 350 from persistence list
16/05/27 10:05:00 INFO streaming.StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
16/05/27 10:05:00 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314640000 ms: 1464314635000 ms
16/05/27 10:05:00 INFO rdd.MapPartitionsRDD: Removing RDD 467 from persistence list
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 350
16/05/27 10:05:00 INFO scheduler.JobGenerator: Stopping JobGenerator immediately
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 467
16/05/27 10:05:00 INFO util.RecurringTimer: Stopped timer for JobGenerator after time 1464314700000
16/05/27 10:05:00 INFO rdd.MapPartitionsRDD: Removing RDD 466 from persistence list
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 466
16/05/27 10:05:00 INFO rdd.MapPartitionsRDD: Removing RDD 465 from persistence list
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 465
16/05/27 10:05:00 INFO rdd.MapPartitionsRDD: Removing RDD 464 from persistence list
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 464
16/05/27 10:05:00 INFO rdd.UnionRDD: Removing RDD 353 from persistence list
16/05/27 10:05:00 INFO storage.BlockManager: Removing RDD 353
16/05/27 10:05:00 INFO dstream.FileInputDStream: Cleared 1 old files that were older than 1464314640000 ms: 1464314635000 ms
16/05/27 10:05:00 INFO scheduler.ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/27 10:05:00 INFO scheduler.InputInfoTracker: remove old batch metadata: 1464314635000 ms
16/05/27 10:05:00 INFO scheduler.JobGenerator: Stopped JobGenerator
16/05/27 10:05:00 INFO scheduler.JobScheduler: Stopped JobScheduler
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/streaming,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/streaming/batch,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/streaming,null}
16/05/27 10:05:00 INFO streaming.StreamingContext: StreamingContext stopped successfully
16/05/27 10:05:00 INFO spark.SparkContext: Invoking stop() from shutdown hook
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/streaming/batch/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/streaming/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/05/27 10:05:00 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/05/27 10:05:00 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.8.191:4041
16/05/27 10:05:00 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/27 10:05:00 INFO storage.MemoryStore: MemoryStore cleared
16/05/27 10:05:00 INFO storage.BlockManager: BlockManager stopped
16/05/27 10:05:00 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/05/27 10:05:00 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/27 10:05:00 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/27 10:05:00 INFO spark.SparkContext: Successfully stopped SparkContext
16/05/27 10:05:00 INFO util.ShutdownHookManager: Shutdown hook called
16/05/27 10:05:00 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/27 10:05:00 INFO util.ShutdownHookManager: Deleting directory C:\Users\Administrator.PC-201512221019\AppData\Local\Temp\spark-aab2c360-86ff-4c8f-927d-f3f3b9e28fc9
Process finished with exit code 1
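A note on the failure above: the `java.lang.IllegalArgumentException: requirement failed` is raised at `MLUtils.scala:330` inside `fastSquaredDistance`, which in Spark 1.6 begins with a `require()` that both vectors have the same size. Given the program arguments (`numDimensions = 2`), the likely cause is that a line in the test file (`D:/testDir/hjr003.txt`) parses to a vector whose dimension differs from the model's 2-dimensional cluster centers. The sketch below is a hypothetical, Spark-free reproduction of that check, not the actual Spark source; the `squaredDistance` name and the sample vectors are illustrative assumptions.

```scala
// Minimal sketch of the dimension check behind "requirement failed".
object DimensionCheck {
  // Mirrors the failing requirement in MLUtils.fastSquaredDistance:
  // both vectors must have the same length before a distance is computed.
  def squaredDistance(v1: Array[Double], v2: Array[Double]): Double = {
    require(v1.length == v2.length) // <- the "requirement failed" seen in the log
    v1.zip(v2).map { case (a, b) => (a - b) * (a - b) }.sum
  }

  def main(args: Array[String]): Unit = {
    val center = Array(0.0, 0.0) // a 2-dimensional cluster center, as trained here
    println(squaredDistance(Array(1.0, 2.0), center)) // same dimension: fine
    // A 3-dimensional test point against a 2-dimensional center throws
    // IllegalArgumentException("requirement failed"), aborting the task:
    // squaredDistance(Array(1.0, 2.0, 3.0), center)
  }
}
```

Checking that every line of the test data has exactly `numDimensions` values (and no stray delimiters or blank lines) should make the `predictOnValues`/`print` job at StreamingKMeansExample.scala:36 succeed.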