```d
interface IOpt(T)
{
  T value();
  bool empty();
  bool opEquals(IOpt!T other);
}

class None(T) : IOpt!T
{
  bool empty() => true;
  T value(){ throw new Exception("None has no value"); }
  bool opEquals(IOpt!T other)=>other.empty;
}

class Some(T) : IOpt!T
{
  this(T value)  { this._value = value; }
  bool empty() => false;
  T value()=> _value;
  bool opEquals(IOpt!T other)=>!other.empty && other.value==_value;

  private T _value;
}

IOpt!T some(T)(T v)=>new Some!T(v);
IOpt!T none(T)()=>new None!T;

void main()
{
  assert(new Some!int(1) == new Some!int(1));
  assert(new None!int == new None!int);
  assert(none!int.opEquals(none!int));
  assert(none!int == none!int);
}
```

It compiles, but the last assertion, ```assert(none!int == none!int);```, fails:

```
core.exception.AssertError@testiface.d(33): Assertion failure
```

To avoid "extrange effects" I test an alternative equality that fails too:

```d
assert( (cast (IOpt!int) new None!int) == (cast (IOpt!int) new None!int));
```

What seems strange to me is that ```none!int.opEquals(none!int)``` works as expected.
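
As a further diagnostic (not part of the original program, and assuming the ```IOpt```/```None``` definitions above are in scope), an identity check suggests that ```==``` on interface references may be comparing references rather than calling ```opEquals```:

```d
void main()
{
  // `is` compares references; two separately allocated None!int
  // instances are always distinct objects, so this passes...
  assert(none!int !is none!int);

  // ...which would explain why `==` on the interface type fails:
  // it appears to behave like the identity comparison above.
}
```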

**Questions**

* Why, for interface references, does calling ```opEquals``` directly not behave the same as using ```==```?

* Is this the expected behaviour?