
I have a DSE Graph cluster in production.

I have enabled Search and Analytics on one node through OpsCenter.

I can successfully launch a Gremlin console and run analytics (OLAP) queries after issuing the command:

:remote config alias g graphName.a
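
For context, the `.a` suffix selects the analytics (OLAP) traversal source, so queries issued through this alias run on Spark. A quick sanity check after setting the alias (a sketch; any simple traversal works):

```
// Gremlin console, after the alias above is in place:
g.V().count()   // executed as an OLAP job on the analytics node
```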

The problem occurs when I try to run a DseGraphFrame query from the Spark Scala console.

Whenever I run the following initial command in the Spark Scala console, I get the same error.

val g = spark.dseGraph("graphName")

com.datastax.driver.core.exceptions.ServerError: An unexpected error occurred server side on /x.x.x.x:9042: Failed to execute method DseGraphRpc.getSchemaBlob
  at com.datastax.driver.core.exceptions.ServerError.copy(ServerError.java:54)
  at com.datastax.driver.core.exceptions.ServerError.copy(ServerError.java:16)
  at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:28)
  at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:236)
  at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:59)
  at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:42)
  at com.datastax.driver.dse.DefaultDseSession.execute(DefaultDseSession.java:232)
  at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40)
  at com.sun.proxy.$Proxy6.execute(Unknown Source)
  at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:40)
  at com.sun.proxy.$Proxy7.execute(Unknown Source)
  at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:42)
  at com.datastax.bdp.graph.spark.DseGraphRpc.callGetSchema(DseGraphRpc.java:47)
  at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer$1.apply(DseGraphFrame.scala:504)
  at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer$1.apply(DseGraphFrame.scala:504)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:112)
  at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111)
  at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:145)
  at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
  at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$.getSchemaFromServer(DseGraphFrame.scala:504)
  at com.datastax.bdp.graph.spark.graphframe.DseGraphFrameBuilder$.apply(DseGraphFrameBuilder.scala:241)
  at com.datastax.bdp.graph.spark.graphframe.SparkSessionFunctions.dseGraph(SparkSessionFunctions.scala:20)
  ... 57 elided
Caused by: com.datastax.driver.core.exceptions.ServerError: An unexpected error occurred server side on /x.x.x.x:9042: Failed to execute method DseGraphRpc.getSchemaBlob
  at com.datastax.driver.core.Responses$Error.asException(Responses.java:114)
  at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:498)
  at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1074)
  at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:991)
  at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
  at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
  at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
  at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
  at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
  at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1069)
  at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:902)
  at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:411)
  at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:248)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353)
  at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
  at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:934)
  at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:405)
  at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:310)
  at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
  at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
  at java.lang.Thread.run(Thread.java:748)

In the Cassandra log file, I see this stack trace:

INFO  [Native-Transport-Requests-14] 2018-04-11 17:54:47,248  RpcMethod.java:177 - Failed to execute method DseGraphRpc.getSchemaBlob
java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
    at com.datastax.bdp.util.rpc.RpcMethod.execute(RpcMethod.java:159) ~[dse-core-5.1.1.jar:5.1.1]
    at com.datastax.bdp.cassandra.cql3.RpcCallStatement.execute(RpcCallStatement.java:92) [dse-core-5.1.1.jar:5.1.1]
    at org.apache.cassandra.cql3.QueryProcessor.processStatement(QueryProcessor.java:218) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
    at com.datastax.bdp.cassandra.cql3.DseQueryHandler$StatementExecution.execute(DseQueryHandler.java:457) [dse-core-5.1.1.jar:5.1.1]
    at com.datastax.bdp.cassandra.cql3.DseQueryHandler$Operation.executeWithTiming(DseQueryHandler.java:369) [dse-core-5.1.1.jar:5.1.1]
    at com.datastax.bdp.cassandra.cql3.DseQueryHandler$Operation.executeAndMaybeWriteToAuditLog(DseQueryHandler.java:420) [dse-core-5.1.1.jar:5.1.1]
    at com.datastax.bdp.cassandra.cql3.DseQueryHandler.process(DseQueryHandler.java:157) [dse-core-5.1.1.jar:5.1.1]
    at com.datastax.bdp.cassandra.cql3.DseQueryHandler.process(DseQueryHandler.java:109) [dse-core-5.1.1.jar:5.1.1]
    at org.apache.cassandra.transport.messages.QueryMessage.execute(QueryMessage.java:112) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
    at org.apache.cassandra.transport.Message$Dispatcher.channelRead0(Message.java:546) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
    at org.apache.cassandra.transport.Message$Dispatcher.channelRead0(Message.java:440) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [netty-all-4.0.42.Final.jar:4.0.42.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) [netty-all-4.0.42.Final.jar:4.0.42.Final]
    at io.netty.channel.AbstractChannelHandlerContext.access$600(AbstractChannelHandlerContext.java:36) [netty-all-4.0.42.Final.jar:4.0.42.Final]
    at io.netty.channel.AbstractChannelHandlerContext$7.run(AbstractChannelHandlerContext.java:358) [netty-all-4.0.42.Final.jar:4.0.42.Final]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
    at org.apache.cassandra.concurrent.AbstractLocalAwareExecutorService$FutureTask.run(AbstractLocalAwareExecutorService.java:162) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
    at org.apache.cassandra.concurrent.SEPWorker.run(SEPWorker.java:109) [cassandra-all-3.10.0.1695.jar:3.10.0.1695]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
Caused by: java.lang.StackOverflowError: null
    at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:281) ~[dse-graph-5.1.1.jar:5.1.1]
    at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new$375(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
    at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
    at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
    at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
    at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
    at com.datastax.bdp.graph.spark.SerializableSchema$Property.<init>(SerializableSchema.java:284) ~[dse-graph-5.1.1.jar:5.1.1]
    at com.datastax.bdp.graph.spark.SerializableSchema$Property.lambda$new$375(SerializableSchema.java:283) ~[dse-graph-5.1.1.jar:5.1.1]
    at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_131]
    at java.util.Iterator.forEachRemaining(Iterator.java:116) ~[na:1.8.0_131]
    at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[na:1.8.0_131]
    at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_131]
    at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_131]
    at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_131]
    at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_131]
    at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_131]
    ... (the same ten frames — SerializableSchema$Property.<init> through ReferencePipeline.collect — repeat until the stack overflows)

I don't have much more information to provide. The graph name is correct, and my configuration seems to be correct.

My schema is large (unfortunately I cannot share it publicly), so maybe that explains the error, but the logs are not very clear.

What can I do to get DseGraphFrame running properly?


1 Answer


This is an issue in DseGraphFrame: if a property key is a meta property of itself, or the meta-property definitions form a cycle, a StackOverflowError occurs while the schema is serialized. It will be fixed in an upcoming release. As a workaround, you can remove the recursion from the schema. A minimal example to reproduce:

system.graph('rec').create()
:remote config alias g rec.g

schema.propertyKey("name").Text().create();
// "name" becomes a meta property of itself, triggering the StackOverflowError:
schema.propertyKey("name").properties("name").add();
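
To find such a cycle in a larger schema, one low-tech approach (a sketch, not an official diagnostic) is to dump the schema definition and inspect the meta-property declarations:

```
// Gremlin console: print the full schema DDL, then search the output for
// propertyKey(...).properties(...) lines where a key references itself,
// or where two keys reference each other.
schema.describe()
```

Once the offending key is found, recreating the schema without the self-referencing `properties(...)` call avoids the recursion until the fix ships.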