Hi,
Hmmm… I build on a Mac. I wonder what's up with:
.bash_profile.macports-saved_2015-10-30_at_21:51:10
That seems like a really weird file name :/.
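
For what it's worth, the colon seems to be the problem, not the name itself: as far as I can tell, Hadoop's Path splits a path component at its first colon and treats the left side as a URI scheme, which leaves the "51:10" remainder as a relative path inside a scheme-qualified URI - exactly what java.net.URI rejects. A minimal sketch with plain java.net.URI (no Hadoop required; the "demo" scheme and "51:10" path are just illustrative stand-ins):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class ColonPathDemo {
    public static void main(String[] args) {
        try {
            // A non-null scheme plus a path that does not start with "/" is
            // roughly what Hadoop's Path ends up handing to java.net.URI
            // when a file name contains a colon.
            new URI("demo", null, "51:10", null);
            System.out.println("no exception");
        } catch (URISyntaxException e) {
            System.out.println(e.getReason()); // "Relative path in absolute URI"
        }
    }
}
```

The same checkPath frame sits at the top of the quoted trace, which is presumably why globbing over the home directory trips over any file name containing a colon.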
Can you @Ignore that test and see if anything else fails for you? Perhaps there
is a general pattern that pops up.
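
If it helps, ignoring it is just JUnit's org.junit.Ignore annotation on the method; a sketch against the method named in the trace (the reason string is my own wording, and the real class of course has more in it than shown):

```java
import org.junit.Ignore;
import org.junit.Test;

public class HadoopGremlinPluginCheck {
    // JUnit skips @Ignore'd tests and reports them as ignored, not failed.
    @Ignore("colon in a HOME-directory file name breaks Hadoop's path globbing")
    @Test
    public void shouldSupportHDFSMethods() {
        // original test body unchanged
    }
}
```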
Thanks,
Marko.
http://markorodriguez.com
On Jan 6, 2016, at 4:29 PM, Jonathan Ellithorpe <[email protected]> wrote:
> Hi Marko,
>
> I'm actually trying to run all the unit tests. I'd like to start from a
> point that passes all the unit tests so that after I make my changes I can
> confirm that it didn't break anything.
>
> I tried to see if compiling on my mac would circumvent the NFS issue, but
> now I get this instead:
>
> shouldSupportHDFSMethods(org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopGremlinPluginCheck)
> Time elapsed: 0.502 sec <<< ERROR!
> java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: .bash_profile.macports-saved_2015-10-30_at_21:51:10
> at java.net.URI.checkPath(URI.java:1823)
> at java.net.URI.<init>(URI.java:745)
> at org.apache.hadoop.fs.Path.initialize(Path.java:202)
> at org.apache.hadoop.fs.Path.<init>(Path.java:171)
> at org.apache.hadoop.fs.Path.<init>(Path.java:93)
> at org.apache.hadoop.fs.Globber.glob(Globber.java:241)
> at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1655)
> at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
> at org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopLoader$_load_closure2.doCall(HadoopLoader.groovy:58)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
> at org.codehaus.groovy.runtime.metaclass.ClosureMetaMethod.invoke(ClosureMetaMethod.java:81)
> at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
> at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.invoke(PojoMetaMethodSite.java:48)
> at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:53)
> at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
> at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
> at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:114)
> at groovysh_evaluate.run(groovysh_evaluate:3)
> at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
> at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:69)
> at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
> at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:185)
> at org.apache.tinkerpop.gremlin.groovy.util.TestableConsolePluginAcceptor.eval(TestableConsolePluginAcceptor.java:67)
> at org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopGremlinPluginCheck.shouldSupportHDFSMethods(HadoopGremlinPluginCheck.java:108)
>
> On Wed, Jan 6, 2016 at 6:00 AM Marko Rodriguez <[email protected]> wrote:
>
>> Hi,
>>
>> As a short term fix, are you wanting to run tests or just build the
>> project? If the latter, then just do "mvn clean install
>> -Dmaven.test.skip=true"
>>
>> HTH,
>> Marko.
>>
>> http://markorodriguez.com
>>
>> On Jan 6, 2016, at 6:05 AM, Stephen Mallette <[email protected]> wrote:
>>
>>> I don't know what the problem could be here. I'm not aware of others
>>> having problems building master atm.
>>>
>>> On Mon, Jan 4, 2016 at 11:41 PM, Jonathan Ellithorpe <[email protected]> wrote:
>>>
>>>> Just tried that, but I'm still getting this "Output directory
>>>> target/test-output/~traversers already exists" error above.
>>>>
>>>> I noticed the following in the logs:
>>>>
>>>> [INFO] Slf4jLogger$anonfun$receive$1$anonfun$applyOrElse$3 - Shutting down remote daemon.
>>>> [INFO] Slf4jLogger$anonfun$receive$1$anonfun$applyOrElse$3 - Remote daemon shut down; proceeding with flushing remote transports.
>>>> [INFO] Slf4jLogger$anonfun$receive$1$anonfun$applyOrElse$3 - Remoting shut down.
>>>> [WARN] HadoopGremlinPlugin - Be sure to set the environmental variable: HADOOP_GREMLIN_LIBS
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/m]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b4000000c2]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b2000000c6]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b0000000bf]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b5000000c5]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b3000000c4]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b6000000c3]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612b1000000c1]: it still exists.
>>>> [WARN] FileUtil - Failed to delete file or dir [/home/jdellit/tmp/incubator-tinkerpop/spark-gremlin/target/test-output/~traversers/.nfs00000000066612af000000c0]: it still exists.
>>>> [INFO] Logging$class - Running Spark version 1.5.1
>>>> [INFO] Logging$class - Changing view acls to: jdellit
>>>> [INFO] Logging$class - Changing modify acls to: jdellit
>>>>
>>>> The appearance of the .nfsXXX files leads me to suspect that files are
>>>> being deleted by one thread while still open in another. If that other
>>>> thread then attempts a write, an .nfsXXX file is generated.
>>>>
>>>> Are other folks running these tests on NFS?
>>>>
>>>> Jonathan
>>>>
>>>>
>>>> On Mon, Jan 4, 2016 at 4:00 PM Stephen Mallette <[email protected]> wrote:
>>>>
>>>>> How about good ol' "mvn clean install"? That works for me (and others)
>>>>> and Travis on master.
>>>>>
>>>>> I guess "mvn clean compile" and "mvn test" fail because gremlin-shaded
>>>>> doesn't get built, as the core of its work is bound to the "package"
>>>>> phase, so you get all those errors. It seems that you have to minimally
>>>>> execute the "package" phase of the Maven lifecycle.
>>>>>
>>>>> On Mon, Jan 4, 2016 at 4:47 PM, Jonathan Ellithorpe <[email protected]> wrote:
>>>>>
>>>>>> Hi Stephen, thanks for the heads up. I remember getting stuck trying
>>>>>> to find a point in the tree from which to base my changes. Either
>>>>>> there was an error compiling TinkerPop, or the unit tests did not
>>>>>> pass. For example, on tag 3.1.0-incubating I get the following error
>>>>>> when executing "mvn clean compile":
>>>>>>
>>>>>> [ERROR] COMPILATION ERROR :
>>>>>> [INFO] -------------------------------------------------------------
>>>>>> [ERROR] /home/jdellit/tmp/incubator-tinkerpop/gremlin-core/src/main/java/org/apache/tinkerpop/gremlin/structure/io/graphson/GraphSONReader.java:[41,53] package org.apache.tinkerpop.shaded.jackson.core.type does not exist
>>>>>> .
>>>>>> .
>>>>>> .
>>>>>>
>>>>>> "mvn clean package -DskipTests" does work.
>>>>>>
>>>>>> Then when I run "mvn test" I get:
>>>>>>
>>>>>> shouldGetVersion(org.apache.tinkerpop.gremlin.util.GremlinTest)
>>>>>> Time elapsed: 0.016 sec <<< ERROR!
>>>>>> java.lang.ExceptionInInitializerError: null
>>>>>> at com.jcabi.manifests.Manifests.read(Manifests.java:274)
>>>>>> at org.apache.tinkerpop.gremlin.util.Gremlin.<clinit>(Gremlin.java:32)
>>>>>> at org.apache.tinkerpop.gremlin.util.GremlinTest.shouldGetVersion(GremlinTest.java:39)
>>>>>>
>>>>>> And several other errors while running tests in Gremlin Core.
>>>>>> Strangely, however, running "mvn package" at this point does not
>>>>>> produce those errors, even though it's running the same tests. It
>>>>>> encounters a different error for Spark Gremlin:
>>>>>>
>>>>>> shouldGracefullyHandleBadGremlinHadoopLibs(org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopGremlinPluginTest)
>>>>>> Time elapsed: 3.318 sec <<< ERROR!
>>>>>>
>>>>>> org.apache.tinkerpop.gremlin.groovy.plugin.RemoteException: java.util.concurrent.ExecutionException: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory target/test-output/~traversers already exists
>>>>>> at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
>>>>>> at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
>>>>>> at org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopRemoteAcceptor.submit(HadoopRemoteAcceptor.java:99)
>>>>>> at org.apache.tinkerpop.gremlin.hadoop.groovy.plugin.HadoopGremlinPluginTest.shouldGracefullyHandleBadGremlinHadoopLibs(HadoopGremlinPluginTest.java:169)
>>>>>> Caused by: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory target/test-output/~traversers already exists
>>>>>> at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1011)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:998)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:998)
>>>>>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>>>>>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>>>>>> at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:998)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopFile$2.apply$mcV$sp(PairRDDFunctions.scala:938)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:930)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions$anonfun$saveAsNewAPIHadoopFile$2.apply(PairRDDFunctions.scala:930)
>>>>>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
>>>>>> at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
>>>>>> at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
>>>>>> at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopFile(PairRDDFunctions.scala:930)
>>>>>> at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopFile(JavaPairRDD.scala:809)
>>>>>> at org.apache.tinkerpop.gremlin.spark.process.computer.SparkExecutor.saveMapReduceRDD(SparkExecutor.java:208)
>>>>>> at org.apache.tinkerpop.gremlin.spark.process.computer.SparkGraphComputer.lambda$submit$21(SparkGraphComputer.java:211)
>>>>>>
>>>>>> I am confused as to why "mvn package" results in the success of
>>>>>> certain tests that fail for "mvn test".
>>>>>>
>>>>>> Jonathan
>>>>>>
>>>>>> On Mon, Jan 4, 2016 at 7:08 AM Stephen Mallette <[email protected]> wrote:
>>>>>>
>>>>>>> Jonathan, just wanted to throw a heads up your way so that you're
>>>>>>> aware of our expected timing. If all goes as planned, we will head
>>>>>>> into code freeze for 3.1.1-incubating in about three weeks. If you
>>>>>>> are still planning on submitting PRs for:
>>>>>>>
>>>>>>> https://issues.apache.org/jira/browse/TINKERPOP3-997
>>>>>>> https://issues.apache.org/jira/browse/TINKERPOP3-998
>>>>>>>
>>>>>>> we'd need to see them in that time frame. I don't mean to apply
>>>>>>> pressure, I just don't want to miss the chance to get these fixes in
>>>>>>> for 3.1.1-incubating.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Dec 9, 2015 at 1:24 AM, Jonathan Ellithorpe <[email protected]> wrote:
>>>>>>>
>>>>>>>> Hi Stephen, working on that now, thanks for pinging me on this.
>>>>>>>>
>>>>>>>> On Tue, Dec 8, 2015 at 4:48 PM Stephen Mallette <[email protected]> wrote:
>>>>>>>>
>>>>>>>>> Hi Jonathan, just wondering if you still plan to look at offering
>>>>>>>>> PRs for:
>>>>>>>>>
>>>>>>>>> https://issues.apache.org/jira/browse/TINKERPOP-998
>>>>>>>>> https://issues.apache.org/jira/browse/TINKERPOP-997
>>>>>>>>>
>>>>>>>>> I'll stay away from those if you think you will be working on them.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Nov 30, 2015 at 1:39 PM, Stephen Mallette <[email protected]> wrote:
>>>>>>>>>
>>>>>>>>>>> Using VertexPropertyFeatures.FEATURE_{ADD, REMOVE}_PROPERTY
>>>>>>>>>>> perhaps would be more consistent with the logic used everywhere
>>>>>>>>>>> else...
>>>>>>>>>>
>>>>>>>>>> Yeah - I'm +1 for this approach. It makes more sense given
>>>>>>>>>> ADD/REMOVE already being the pattern for graph Element instances.
>>>>>>>>>>
>>>>>>>>>> On Mon, Nov 30, 2015 at 1:31 PM, Jonathan Ellithorpe <[email protected]> wrote:
>>>>>>>>>>
>>>>>>>>>>> I think it's either that or change FEATURE_META_PROPERTY to a
>>>>>>>>>>> symmetric VertexFeatures.FEATURE_{ADD, REMOVE}_METAPROPERTY to
>>>>>>>>>>> pair with VertexFeatures.FEATURE_{ADD, REMOVE}_PROPERTY.
>>>>>>>>>>>
>>>>>>>>>>> Using VertexPropertyFeatures.FEATURE_{ADD, REMOVE}_PROPERTY
>>>>>>>>>>> perhaps would be more consistent with the logic used everywhere
>>>>>>>>>>> else...
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Mon, Nov 30, 2015 at 6:30 AM Stephen Mallette <[email protected]> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Ugh - mess. Maybe we should just keep the add/remove symmetry
>>>>>>>>>>>> and deprecate FEATURE_META_PROPERTY then.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sat, Nov 28, 2015 at 4:11 PM, Jonathan Ellithorpe <[email protected]> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> 1) Yes, I can submit a PR for fixing the SIMPLE feature
>>>>>>>>>>>>> requirement set.
>>>>>>>>>>>>> 2) I also agree with deprecating
>>>>>>>>>>>>> VertexPropertyFeatures.FEATURE_ADD_PROPERTY, but looking at the
>>>>>>>>>>>>> code I think I realized there is a slight complication here.
>>>>>>>>>>>>> That is, what to do with
>>>>>>>>>>>>> VertexPropertyFeatures.FEATURE_REMOVE_PROPERTY. Does
>>>>>>>>>>>>> VertexFeatures.FEATURE_META_PROPERTIES imply both ADD and
>>>>>>>>>>>>> REMOVE, or only ADD? In the latter case, we would need to keep
>>>>>>>>>>>>> VertexPropertyFeatures.FEATURE_REMOVE_PROPERTY. Personally,
>>>>>>>>>>>>> seeing as how VertexFeatures, extending ElementFeatures, has a
>>>>>>>>>>>>> FEATURE_ADD_PROPERTY and FEATURE_REMOVE_PROPERTY, I would
>>>>>>>>>>>>> suggest that FEATURE_META_PROPERTIES be changed to
>>>>>>>>>>>>> FEATURE_ADD_METAPROPERTY and FEATURE_REMOVE_METAPROPERTY.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Jonathan
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Fri, Nov 27, 2015 at 4:55 AM Stephen Mallette <[email protected]> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> ...damn - hot key sent my post too soon - trying again:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Hi Jonathan, thanks for bringing this up. It would be nice if
>>>>>>>>>>>>>> we could expand coverage of our test suite by simply improving
>>>>>>>>>>>>>> the way in which features are applied. I was about to suggest
>>>>>>>>>>>>>> a big set of changes when I realized that
>>>>>>>>>>>>>> FeatureRequirementSet.SIMPLE is just defined wrong. It
>>>>>>>>>>>>>> shouldn't have this entry:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> add(FeatureRequirement.Factory.create(Graph.Features.VertexPropertyFeatures.FEATURE_ADD_PROPERTY,
>>>>>>>>>>>>>> Graph.Features.VertexPropertyFeatures.class));
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> it should just be:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> add(FeatureRequirement.Factory.create(Graph.Features.VertexFeatures.FEATURE_ADD_PROPERTY,
>>>>>>>>>>>>>> Graph.Features.VertexFeatures.class));
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I've created an issue for that to track things:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> https://issues.apache.org/jira/browse/TINKERPOP3-997
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> because it is a "breaking" change as it will open up tests and
>>>>>>>>>>>>>> possibly cause existing implementations to fail. If you'd like
>>>>>>>>>>>>>> to submit a PR for this little fix, as you were the reporter
>>>>>>>>>>>>>> for it and as someone who can test it in a way that is
>>>>>>>>>>>>>> currently failing for them, just let me know.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> As for this issue:
>>>>>>>>>>>>>> Graph.Features.VertexPropertyFeatures.FEATURE_ADD_PROPERTY
>>>>>>>>>>>>>> <==> Graph.Features.VertexFeatures.FEATURE_META_PROPERTIES -
>>>>>>>>>>>>>> yeah - we need to deprecate one of those as they are the same
>>>>>>>>>>>>>> thing. Not sure if anyone has any preferences on that. In one
>>>>>>>>>>>>>> sense, FEATURE_ADD_PROPERTY is better because it matches the
>>>>>>>>>>>>>> approach for Vertex/Edge.
>>>>>>>>>>>>>
>>>>>>>>>>>>>> On the other hand, the documentation refers to this feature as
>>>>>>>>>>>>>> "meta-properties". I guess I would go with keeping
>>>>>>>>>>>>>> FEATURE_META_PROPERTIES and deprecating FEATURE_ADD_PROPERTY.
>>>>>>>>>>>>>> I've created an issue as such:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> https://issues.apache.org/jira/browse/TINKERPOP3-998
>>>>>>>>>>>>>
>>>>>>>>>>>>>> If no one has any objections in the next 72 hours (Monday,
>>>>>>>>>>>>>> November 30, 2015 at 7:45am) I'll assume lazy consensus and we
>>>>>>>>>>>>>> can move forward with this one.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Fri, Nov 27, 2015 at 7:35 AM, Stephen Mallette <[email protected]> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi Jonathan, thanks for bringing this up. It would be nice
>>>>>>>>>>>>>>> if we could expand coverage of our test suite by simply
>>>>>>>>>>>>>>> improving the way in which features are applied. I was about
>>>>>>>>>>>>>>> to suggest a big set of changes when I realized that
>>>>>>>>>>>>>>> FeatureRequirementSet.SIMPLE is just defined wrong. It
>>>>>>>>>>>>>>> shouldn't have
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> add(FeatureRequirement.Factory.create(Graph.Features.VertexPropertyFeatures.FEATURE_ADD_PROPERTY,
>>>>>>>>>>>>>>> Graph.Features.VertexPropertyFeatures.class));
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> it should just be:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Mon, Nov 23, 2015 at 7:39 PM, Jonathan Ellithorpe <[email protected]> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Hello all,
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I am currently working on an experimental implementation of
>>>>>>>>>>>>>>>> TinkerPop3 on an in-memory key-value store called RAMCloud.
>>>>>>>>>>>>>>>> In the process of running the unit tests I noticed that
>>>>>>>>>>>>>>>> turning on support for persistence did not trigger any new
>>>>>>>>>>>>>>>> unit tests in GraphTests. Looking into the matter, I found
>>>>>>>>>>>>>>>> that the unit test that tests this, shouldPersistOnClose,
>>>>>>>>>>>>>>>> was not executing because meta properties support is
>>>>>>>>>>>>>>>> included in its feature requirements, but I do not have
>>>>>>>>>>>>>>>> support for meta properties. Oddly, though, this feature
>>>>>>>>>>>>>>>> requirement seems to be superfluous, since the test does
>>>>>>>>>>>>>>>> not utilize meta properties.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> An orthogonal issue seems to be that
>>>>>>>>>>>>>>>> Graph.Features.VertexPropertyFeatures.FEATURE_ADD_PROPERTY
>>>>>>>>>>>>>>>> <==> Graph.Features.VertexFeatures.FEATURE_META_PROPERTIES
>>>>>>>>>>>>>>>> Best,
>>>>>>>>>>>>>>>> Jonathan