Re: [swift-evolution] JSONEncoder: Key strategies

2017-11-09 Thread Itai Ferber via swift-evolution

Hi Brent,

Perhaps the wording would be better phrased as "boundary from 
non-uppercase-character to uppercase-character"; i.e., numbers and emoji 
are treated the same as lowercase characters and are included in the 
original word.
The following are [unit test cases from the associated 
PR](https://github.com/apple/swift/pull/12779/files#diff-26b09c16508c21f9f59dcf6c7a41d4b4R422), 
which should indicate the behavior implemented here:


```swift
let toSnakeCaseTests = [
  ("simpleOneTwo", "simple_one_two"),
  ("myURL", "my_url"),
  ("singleCharacterAtEndX", "single_character_at_end_x"),
  ("thisIsAnXMLProperty", "this_is_an_xml_property"),
  ("single", "single"), // no underscore
  ("", ""), // don't die on empty string
  ("a", "a"), // single character
  ("aA", "a_a"), // two characters
  ("version4Thing", "version4_thing"), // numerics
  ("partCAPS", "part_caps"), // only insert underscore before first all caps
  ("partCAPSLowerAGAIN", "part_caps_lower_again"), // switch back and forth caps.
  ("manyWordsInThisThing", "many_words_in_this_thing"), // simple lowercase underscore more

  ("asdfĆqer", "asdf_ćqer"),
  ("already_snake_case", "already_snake_case"),
  ("dataPoint22", "data_point22"),
  ("dataPoint22Word", "data_point22_word"),
  ("_oneTwoThree", "_one_two_three"),
  ("oneTwoThree_", "one_two_three_"),
  ("__oneTwoThree", "__one_two_three"),
  ("oneTwoThree__", "one_two_three__"),
  ("_oneTwoThree_", "_one_two_three_"),
  ("__oneTwoThree", "__one_two_three"),
  ("__oneTwoThree__", "__one_two_three__"),
  ("_test", "_test"),
  ("_test_", "_test_"),
  ("__test", "__test"),
  ("test__", "test__"),
  ("m͉̟̹y̦̳G͍͚͎̳r̤͉̤͕ͅea̲͕t͇̥̼͖U͇̝̠R͙̻̥͓̣L̥̖͎͓̪̫ͅR̩͖̩eq͈͓u̞e̱s̙t̤̺ͅ", 
"m͉̟̹y̦̳_g͍͚͎̳r̤͉̤͕ͅea̲͕t͇̥̼͖_u͇̝̠r͙̻̥͓̣l̥̖͎͓̪̫ͅ_r̩͖̩eq͈͓u̞e̱s̙t̤̺ͅ"), 
// because Itai wanted to test this

  ("🐧🐟", "🐧🐟") // fishy emoji example?
]
```
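
The conversion these tests exercise can be sketched roughly as follows. This is a simplified reimplementation for illustration (the real code lives in the overlay linked above); `toSnakeCase` is a hypothetical free function, not the shipped API:

```swift
import Foundation

// Hypothetical sketch of the camelCase -> snake_case conversion the tests
// above exercise; not the actual overlay implementation.
func toSnakeCase(_ key: String) -> String {
    guard !key.isEmpty else { return key }

    var words: [Range<String.Index>] = []
    var wordStart = key.startIndex
    var searchRange = key.index(after: wordStart)..<key.endIndex

    // Split on each boundary from a non-uppercase to an uppercase character.
    while let upper = key.rangeOfCharacter(from: .uppercaseLetters, options: [], range: searchRange) {
        words.append(wordStart..<upper.lowerBound)
        searchRange = upper.lowerBound..<key.endIndex
        guard let lower = key.rangeOfCharacter(from: .lowercaseLetters, options: [], range: searchRange) else {
            // Trailing all-caps run, e.g. the "CAPS" in "partCAPS".
            wordStart = searchRange.lowerBound
            break
        }
        if lower.lowerBound == key.index(after: upper.lowerBound) {
            // A single capital followed by lowercase starts a word ("Two" in "oneTwo").
            wordStart = upper.lowerBound
        } else {
            // In a caps run like "XMLProperty", the last capital starts the next word.
            let beforeLower = key.index(before: lower.lowerBound)
            words.append(upper.lowerBound..<beforeLower)
            wordStart = beforeLower
        }
        searchRange = lower.upperBound..<key.endIndex
    }
    words.append(wordStart..<key.endIndex)
    return words.map { key[$0].lowercased() }.joined(separator: "_")
}
```

Underscores are neither uppercase nor lowercase, so leading and trailing `_` simply ride along with the adjacent word, which is how the `_oneTwoThree_` cases fall out for free.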

And for completeness, the [complementary test 
cases](https://github.com/apple/swift/pull/12779/files#diff-26b09c16508c21f9f59dcf6c7a41d4b4R540):


```swift
let fromSnakeCaseTests = [
  ("", ""), // don't die on empty string
  ("a", "a"), // single character
  ("ALLCAPS", "ALLCAPS"), // If no underscores, we leave the word as-is
  ("ALL_CAPS", "allCaps"), // Conversion from screaming snake case
  ("single", "single"), // do not capitalize anything with no underscore
  ("snake_case", "snakeCase"), // capitalize a character
  ("one_two_three", "oneTwoThree"), // more than one word
  ("one_2_three", "one2Three"), // numerics
  ("one2_three", "one2Three"), // numerics, part 2
  ("snake_Ćase", "snakeĆase"), // do not further modify a capitalized diacritic
  ("snake_ćase", "snakeĆase"), // capitalize a diacritic
  ("alreadyCamelCase", "alreadyCamelCase"), // do not modify already camel case
  ("__this_and_that", "__thisAndThat"),
  ("_this_and_that", "_thisAndThat"),
  ("this__and__that", "thisAndThat"),
  ("this_and_that__", "thisAndThat__"),
  ("this_aNd_that", "thisAndThat"),
  ("_one_two_three", "_oneTwoThree"),
  ("one_two_three_", "oneTwoThree_"),
  ("__one_two_three", "__oneTwoThree"),
  ("one_two_three__", "oneTwoThree__"),
  ("_one_two_three_", "_oneTwoThree_"),
  ("__one_two_three", "__oneTwoThree"),
  ("__one_two_three__", "__oneTwoThree__"),
  ("_test", "_test"),
  ("_test_", "_test_"),
  ("__test", "__test"),
  ("test__", "test__"),
  ("_", "_"),
  ("__", "__"),
  ("___", "___"),
  ("m͉̟̹y̦̳G͍͚͎̳r̤͉̤͕ͅea̲͕t͇̥̼͖U͇̝̠R͙̻̥͓̣L̥̖͎͓̪̫ͅR̩͖̩eq͈͓u̞e̱s̙t̤̺ͅ", 
"m͉̟̹y̦̳G͍͚͎̳r̤͉̤͕ͅea̲͕t͇̥̼͖U͇̝̠R͙̻̥͓̣L̥̖͎͓̪̫ͅR̩͖̩eq͈͓u̞e̱s̙t̤̺ͅ"), 
// because Itai wanted to test this

  ("🐧_🐟", "🐧🐟") // fishy emoji example?
]
```
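
The reverse conversion can likewise be sketched as follows. Again, this is an illustrative reimplementation matching the test cases, not the shipped overlay code; `fromSnakeCase` is a hypothetical free function:

```swift
import Foundation

// Hypothetical sketch of the snake_case -> camelCase conversion the tests
// above exercise; not the actual overlay implementation.
func fromSnakeCase(_ key: String) -> String {
    // Keys with no underscores (including "" and "ALLCAPS") pass through unchanged.
    guard key.contains("_") else { return key }
    // All-underscore keys like "_" or "___" also pass through unchanged.
    guard let firstNonUnderscore = key.firstIndex(where: { $0 != "_" }),
          let lastNonUnderscore = key.lastIndex(where: { $0 != "_" }) else { return key }

    // Preserve leading and trailing underscores around the converted core.
    let leading = String(key[..<firstNonUnderscore])
    let trailing = String(key[key.index(after: lastNonUnderscore)...])
    let components = key[firstNonUnderscore...lastNonUnderscore].split(separator: "_")

    let joined: String
    if components.count == 1 {
        // Only edge underscores were removed; leave the core word as-is ("_test" -> "_test").
        joined = String(components[0])
    } else {
        // Lowercase the first word, capitalize the rest ("ALL_CAPS" -> "allCaps").
        joined = components.first!.lowercased()
            + components.dropFirst().map { String($0).capitalized }.joined()
    }
    return leading + joined + trailing
}
```

Note that `split(separator: "_")` drops empty subsequences by default, which is what collapses the doubled underscores in `this__and__that`.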

— Itai

On 9 Nov 2017, at 5:57, Brent Royal-Gordon via swift-evolution wrote:

> On Nov 6, 2017, at 12:54 PM, Tony Parker via swift-evolution wrote:


Converting from camel case to snake case:

1. Splits words at the boundary of lower-case to upper-case
2. Inserts `_` between words
3. Lowercases the entire string
4. Preserves starting and ending `_`.

For example, `oneTwoThree` becomes `one_two_three`. `_oneTwoThree_` 
becomes `_one_two_three_`.


My first thought was "are you handling `valueAsHTML` correctly?", but 
it looks like you are with the "boundary of lower-case to upper-case" 
wording. But what do you plan to do for numbers? Characters in 
caseless scripts? Emoji (which are valid in Swift identifiers)? I 
don't necessarily have strong opinions about the right answer—just 
want to make sure you do *something* about it.


--
Brent Royal-Gordon
Architechies




___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] JSONEncoder: Key strategies

2017-11-07 Thread Itai Ferber via swift-evolution

Hi Norio,

There are two reasons that I think this is valuable over doing something 
in `CodingKeys`:


1. The definition you give your coding keys affects all encoding 
formats. JSON is a format where snake_case can be relatively common, so 
the transformation makes a lot of sense there. For other formats, like 
plist files or otherwise, the transformation might not make as much 
sense. Instead of affecting all of your coding keys globally, this 
limits it to JSON.
2. More importantly, this allows you to transform keys of things which 
you don’t necessarily own. If you’re working with types that you 
didn’t write (but which are expected to have snake_case keys 
nonetheless), this allows you to perform that transformation. If this 
were instead an annotation on `CodingKeys` directly, you wouldn’t be 
able to perform it on types you don’t directly own.
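
As a concrete illustration of point 2, the strategy (using the API as proposed, which later shipped in Swift 4.1's Foundation) applies the transformation at the encoder, without touching the type's `CodingKeys`. `ThirdPartyUser` here is a hypothetical stand-in for a type you don't own:

```swift
import Foundation

// Stand-in for a type from a library you can't modify.
struct ThirdPartyUser: Codable {
    let firstName: String
    let signUpCount: Int
}

let encoder = JSONEncoder()
// The transformation lives on the encoder, not on the type's CodingKeys,
// so it applies only to this JSON coder and not to other formats.
encoder.keyEncodingStrategy = .convertToSnakeCase
encoder.outputFormatting = .sortedKeys

let data = try! encoder.encode(ThirdPartyUser(firstName: "Alice", signUpCount: 3))
print(String(data: data, encoding: .utf8)!)
// {"first_name":"Alice","sign_up_count":3}
```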


— Itai

On 6 Nov 2017, at 17:39, Norio Nomura via swift-evolution wrote:


Hi Tony,

Is it better for us to choose on `Codable` side whether `rawValue` of
`CodingKeys` should be generated with snake_case?
It seems to be more consistent with the current method of setting
`rawValue` of `CodingKeys` on `Codable` side.

Thanks,
--
@norio_nomura

2017-11-07 5:54 GMT+09:00 Tony Parker via swift-evolution <swift-evolution@swift.org>:


Hi everyone,

While we have no formal process at this time for proposals of changes to
Foundation-only code, I would still like to post one that we have run
through our internal process here for additional public comment.

Link to PR with proposal content:

https://github.com/apple/swift-corelibs-foundation/pull/1301

Link to implementation for the overlay:

https://github.com/apple/swift/pull/12779

Markdown follows.

Thanks,
- Tony

# Key Strategies for JSONEncoder and JSONDecoder

* Proposal: SCLF-0001
* Author(s): Tony Parker 

# Related radars or Swift bugs

*  Snake case / Camel case conversions for
JSONEncoder/Decoder

# Revision history

* **v1** Initial version

## Introduction

While early feedback for `JSONEncoder` and `JSONDecoder` has been very
positive, many developers have told us that they would appreciate a
convenience for converting between `snake_case_keys` and `camelCaseKeys`
without having to manually specify the key values for all types.

## Proposed solution

`JSONEncoder` and `JSONDecoder` will gain new strategy properties to allow
for conversion of keys during encoding and decoding.
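
In use, the decoding side might look like this sketch (assuming the API as proposed; `Post` is a hypothetical model type):

```swift
import Foundation

struct Post: Codable {
    let authorName: String
    let likeCount: Int
}

let json = Data("""
{"author_name": "Norio", "like_count": 42}
""".utf8)

let decoder = JSONDecoder()
// Keys in the payload are converted from snake_case before being matched
// against the type's camelCase CodingKeys.
decoder.keyDecodingStrategy = .convertFromSnakeCase
let post = try! decoder.decode(Post.self, from: json)
print(post.authorName, post.likeCount)  // Norio 42
```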

```swift
class JSONDecoder {
    /// The strategy to use for automatically changing the value of keys before decoding.
    public enum KeyDecodingStrategy {
        /// Use the keys specified by each type. This is the default strategy.
        case useDefaultKeys

        /// Convert from "snake_case_keys" to "camelCaseKeys" before attempting
        /// to match a key with the one specified by each type.
        ///
        /// The conversion to upper case uses `Locale.system`, also known as the
        /// ICU "root" locale. This means the result is consistent regardless of
        /// the current user's locale and language preferences.
        ///
        /// Converting from snake case to camel case:
        /// 1. Capitalizes the word starting after each `_`
        /// 2. Removes all `_`
        /// 3. Preserves starting and ending `_` (as these are often used to
        ///    indicate private variables or other metadata).
        /// For example, `one_two_three` becomes `oneTwoThree`.
        /// `_one_two_three_` becomes `_oneTwoThree_`.
        ///
        /// - Note: Using a key decoding strategy has a nominal performance cost,
        ///   as each string key has to be inspected for the `_` character.
        case convertFromSnakeCase

        /// Provide a custom conversion from the key in the encoded JSON to the
        /// keys specified by the decoded types.
        /// The full path to the current decoding position is provided for
        /// context (in case you need to locate this key within the payload). The
        /// returned key is used in place of the last component in the coding
        /// path before decoding.
        case custom(([CodingKey]) -> CodingKey)
    }

    /// The strategy to use for decoding keys. Defaults to `.useDefaultKeys`.
    open var keyDecodingStrategy: KeyDecodingStrategy = .useDefaultKeys
}

class JSONEncoder {
    /// The strategy to use for automatically changing the value of keys before encoding.
    public enum KeyEncodingStrategy {
        /// Use the keys specified by each type. This is the default strategy.
        case useDefaultKeys

        /// Convert from "camelCaseKeys" to "snake_case_keys" before writing a
        /// key to JSON payload.
        ///
        /// Capital characters are determined by testing membership in
        /// `CharacterSet.uppercaseLetters` and `CharacterSet.lowercaseLetters`
        /// (Unicode General Categories Lu and Lt).
        /// The conversion to lower case uses `Locale.system`, also known as the
        /// ICU "root" locale. This means the result is consistent regardless of
        /// the current user's locale and language preferences.
        ///
        /// Converting fr
```

Re: [swift-evolution] [Pitch] DateComponents{Encoding/Decoding}Strategy in JSON{Encoder/Decoder}

2017-09-08 Thread Itai Ferber via swift-evolution

Hi Pitiphong,

Thanks for taking the time and energy to pitch this, too! If we can find 
a good solution for matching this up with ISO 8601, and we have high 
demand for this feature, I think it will be worth reconsidering again in 
the future.

Thanks for the input!

— Itai

On 8 Sep 2017, at 13:00, Pitiphong Phongpattranont wrote:


Hi Itai,

As I told you in my last email, I have been thinking about the ISO 8601 
case. After thinking about it, having a discussion on Swift Evolution, 
and reading your emails, I think it may not be worth adding this to the 
Swift standard library. The use case is not that common, so it is not 
worth the cost of maintenance alone, not to mention the effort of 
working out how to implement it properly (if we choose to support the 
`iso8601` strategy).


I think I will close this pitch, and would like to thank you for 
reviewing and discussing it.



— Pitiphong P.


On 7 Sep BE 2560, at 01:03, Itai Ferber  wrote:

Hi Pitiphong,

Don’t worry — your original email was clear, and we are on the 
same page about Date{En,De}codingStrategy and 
DateComponents{En,De}codingStrategy being separate things.
To clarify my points, though, there are two main things I want to 
say:


1. I think there is a mismatch here between your goal of representing 
the components of a date (and what DateComponents can specifically 
hold) and the goal of ISO 8601
2. I think that there is an inherent problem in parsing DateComponents 
due to ambiguity

I think both of these issues can be solved by reading and writing a 
Date (formatted however you need it to be) instead of DateComponents.


To elaborate:

DateComponents is meant to be a container for an arbitrary subset of 
information about a Date. A Date represents a specific instant in 
time, but DateComponents are effectively meaningless without 
additional context. In the examples that you give, it’s possible to 
represent the concepts at hand with DateComponents, but in order to 
make those components actionable and meaningful, you still need to 
convert them to Dates. Note also that:


It’s entirely possible to create a DateComponents which represents 
a date which does not exist, or a time which does not exist
Any of these concepts can also be represented by a Date instead of 
just components; e.g., an all-day event can be represented by a Date 
that represents the beginning of the day (00:00:00) and a flag that 
indicates that the time of the event can be ignored, or by a start 
Date that represents the start of the day and an end Date that 
represents the end of the day
Unlike DateComponents, ISO 8601 strings have some structure to them. 
They cannot represent just a time zone, for instance, or some 
singular components of a date/time (e.g. a month without a year, a 
day without a month and year, a minute without an hour, a second 
without a minute and hour, etc.). I think this is a relatively large 
conceptual mismatch that is worth considering deeply. There are a lot 
of DateComponents instances which simply cannot be represented by an 
ISO 8601 string


There is also the issue of decoding arbitrary ISO 8601 strings into 
DateComponents. DateComponents, having no structure at all, have no 
specified format they can expect to decode from, and ISO 8601 does 
not always provide that structure. Consider the following example:


ISO 8601 allows for date representations by year, month, and day 
(YYYY-MM-DD), among other forms. But it also allows days to be left 
unspecified (YYYY-MM), and even months (YYYY)
Similarly, it allows for a time representations by hour, minute, and 
second (hh:mm:ss), but also just hour and minute (hh:mm), and just 
hour (hh). Importantly, it allows time separators to be omitted 
(hhmmss, hhmm, hh)
Consider then, attempting to parse the string "2017" without any 
context — what DateComponents should be read out? Intuitively, 2017 
looks like a year (YYYY), but it is equally valid to parse as the 
time 20:17 (hhmm). Without knowing the expected format, parsing is 
ambiguous
We cannot promise to parse DateComponents in all cases because there 
are many combinations of strings that are just completely ambiguous.


So, to get at the core of this — if there is a specific format that 
you would like to encode to and from, why not do so with a Date and a 
DateFormatter (or if you need ISO 8601 specifically, 
ISO8601DateFormatter)? With a formatter, the format is unambiguous 
because you explicitly provide it, and there is nothing the date 
can’t represent that DateComponents can. You can always parse the 
date and pull out only those components that you care about. You also 
mention interoperability with an external JSON source — how is that 
source producing a string/parsing one back? [What I’m getting at 
here is: what is the value of adding a new, potentially risky 
strategy over existing methods that might work just as well, or 
better?]


And lastly, if .iso8601 is not necessarily a good fit for this 
strategy, what separates .cu

Re: [swift-evolution] [Proposal] Explicit Synthetic Behaviour

2017-09-08 Thread Itai Ferber via swift-evolution


> On Sep 8, 2017, at 12:46 AM, Haravikk via swift-evolution wrote:
> 
> 
>> On 7 Sep 2017, at 22:02, Itai Ferber wrote:
>> 
>> protocol Fooable : Equatable { // Equatable is just a simple example
>> var myFoo: Int { get }
>> }
>> 
>> extension Fooable {
>> static func ==(_ lhs: Self, _ rhs: Self) -> Bool {
>> return lhs.myFoo == rhs.myFoo
>> }
>> }
>> 
>> struct X : Fooable {
>> let myFoo: Int
>> let myName: String
>> // Whoops, forgot to give an implementation of ==
>> }
>> 
>> print(X(myFoo: 42, myName: "Alice") == X(myFoo: 42, myName: "Bob")) // true
>> This property is necessary, but not sufficient to provide a correct 
>> implementation. A default implementation might be able to assume something 
>> about the types that it defines, but it does not necessarily know enough.
> 
> Sorry but that's a bit of a contrived example; in this case the protocol 
> should not implement the equality operator if more information may be 
> required to define equality. It should only be implemented if the protocol is 
> absolutely clear that .myFoo is the only part of a Fooable that can or should 
> be compared as equatable, e.g- if a Fooable is a database record and .myFoo 
> is a primary key, the data could differ but it would still be a reference to 
> the same record.
> 
> To be clear, I'm not arguing that someone can't create a regular default 
> implementation that also makes flawed assumptions, but that 
> synthesised/reflective implementations by their very nature have to, as they 
> cannot under every circumstance guarantee correctness when using parts of a 
> concrete type that they know nothing about.
You can’t argue this both ways:

If you’re arguing this on principle — that in order for synthesized 
implementations to be correct, they must be able to, under every circumstance, 
guarantee correctness — then you have to apply the same reasoning to default 
protocol implementations. Given a default protocol implementation, it is 
possible to come up with a (no matter how contrived) case where the default 
implementation is wrong. Since you’re arguing this on principle, you cannot 
reject contrived examples.

If you are arguing this in practice, then you’re going to have to back up your 
argument with evidence that synthesized examples are more often wrong than 
default implementations. You can’t declare that synthesized implementations are 
by nature incorrect but allow default implementations to slide because in 
practice, many implementations are allowable. There’s a reason why synthesis 
passed code review and was accepted: in the majority of cases, synthesis was 
deemed to be beneficial, and would provide correct behavior. If you are 
willing to say that yes, sometimes default implementations are wrong but 
overall they’re correct, you’re going to have to provide hard evidence to back 
up the opposite case for synthesized implementations. You stated in a previous 
email that "A synthesised/reflective implementation however may return a result 
that is simply incorrect, because it is based on assumptions made by the 
protocol developer, with no input from the developer of the concrete type. In 
this case the developer must override it in to provide correct behaviour." — if 
you can back this up with evidence (say, taking a survey of a large number of 
model types to see whether in the majority of cases a synthesized implementation 
would be incorrect) to provide a compelling argument, then this is something 
that we should in that case reconsider.

>>> Reflective/synthesised default implementations must by their very nature 
>>> make assumptions about a concrete type that cannot be guaranteed to 
>>> be correct. The properties and methods they may end up interacting with may 
>>> have nothing at all to do with the protocol. Equatable remains by far the 
>>> simplest example; just because a developer has used equatable properties 
>>> does not guarantee that all of them should be compared during a check for 
>>> equality.
>> In the same way that you might consider synthesized conformances to 
>> overreach into a type and touch things which are not related to a protocol, 
>> default implementations can be considered underreach in that they don’t know 
>> anything about properties which are necessary for providing a correct 
>> implementation.
> 
> If more information is necessary to provide a correct implementation, then a 
> default implementation shouldn't be provided. This is what unimplemented 
> properties and methods are for; either getting the developer to provide the 
> missing information, or getting them to implement the correct behaviour.
I agree, but you can’t selectively argue this.


Re: [swift-evolution] [Proposal] Explicit Synthetic Behaviour

2017-09-07 Thread Itai Ferber via swift-evolution


> On Sep 7, 2017, at 1:43 PM, Haravikk via swift-evolution wrote:
> 
>> 
>> On 7 Sep 2017, at 19:36, Tony Allevato wrote:
>> 
>> 
>> 
>> On Thu, Sep 7, 2017 at 11:18 AM Haravikk via swift-evolution wrote:
>> 
>>> On 7 Sep 2017, at 18:53, Tony Allevato via swift-evolution wrote:
>>> 
>>> 
>>> 
>>> On Thu, Sep 7, 2017 at 10:39 AM Gwendal Roué wrote:
 On 7 Sep 2017, at 14:45, Tony Allevato wrote:
 
 Right, let's make sure we're talking about the right thing here. Gwendal, 
 your issue isn't with synthesis in the form of Codable or the new 
 additions to Equatable/Hashable which are opt-in-by-conformance, it's with 
 the specific case of raw value enums or enums without associated values 
 where the synthesis is implicit with no way to opt-out. That's a big 
 difference.
>>> 
>>> Yes.
>>> 
 I can definitely see the latter being an issue if it were more widespread, 
 and I'd be supportive of those enums being required to declare their 
 conformance for consistency (though it would be source breaking).
>>> 
>>> Yes, unfortunately.
>>> 
>>> 
 However, I still haven't seen a real issue that has come up because of the 
 distinction being drawn here between default implementations vs. 
 implementations that can access other parts of the concrete type. It 
 sounds like this discussion is trying to protect against a hypothetical 
 problem that hasn't happened yet and may not happen; it would be helpful 
 to show some motivating real-world cases where this is indeed a severe 
 problem.
>>> 
>>> Yes. I'm not talking about implementation itself. I know this has been the 
>>> main topic until I have tried to bring in the topic of the consequences of 
>>> non-avoidable synthesis (extra methods that may conflict with userland 
>>> methods).
>>> 
>>> If you ask me for a real-world case, then I think I gave one. Let me 
>>> rephrase it:
>>> 
>>> it's impossible to define a value-backed enum without getting free 
>>> Equatable conformance. This free conformance is sometimes unwanted, and I 
>>> gave the example of DSLs. Now this problem is not *severe*. It's more a 
>>> blind spot in the language, and finally just an unwanted side-effect of a 
>>> compiler convenience,
>>> 
>>> Again, this is not the issue that Haravikk is describing in this thread.
>>> 
>>> I'll clarify—your issue is specifically with the fact that enums with raw 
>>> values and enums without associated values receive Equatable even without 
>>> explicitly conforming to it, and therefore users have no way of opting out 
>>> of it. This predates SE-0185, and I didn't propose making any changes to 
>>> the conformance of those enums for source compatibility reasons, though I 
>>> wouldn't be opposed to it because it makes them consistent across the board.
>>> 
>>> Haravikk's argument is about synthesized conformances like Codable and 
>>> Equatable/Hashable in SE-0185, where the user must explicitly conform the 
>>> type to those protocols. His claim is that that act of opting in is not 
>>> sufficient and that it is still dangerous if those synthesized conformances 
>>> can access members that are not also declared in the protocol. That's a 
>>> completely separate issue to yours, and one that I hope he'll present more 
>>> evidence of. Right now, requiring that you not only explicitly conform to 
>>> the protocol but also explicitly request the synthesis feels like a 
>>> solution without an actual problem, and is a situation we already have 
>>> today with default method implementations.
>> 
>> The simplest real-world case is easy:
>> 
>>  struct Foo { var data:String }
>>  extension Foo : Equatable {} // This currently produces an error, in future it will not
>> 
>> 
>> Why is this a problem? It's no different than if someone extended Foo to 
>> conform to a protocol with a default implementation that was written in code.
> 
> I'm sorry but I have now explained why it isn't multiple times; a 
> non-reflective default conformance can ONLY act upon methods and properties 
> that the protocol itself has defined, meaning that it knows everything it 
> needs to know in order to do whatever it wants to do with those methods and 
> properties because it defined them.
Just because it might have defined the properties does not necessarily mean 
that those properties are sufficient context for providing a default 
implementation:
```swift
protocol Fooable : Equatable { // Equatable is just a simple example
    var myFoo: Int { get }
}

extension Fooable {
    static func ==(_ lhs: Self, _ rhs: Self) -> Bool {
        return lhs.myFoo == rhs.myFoo
    }
}

struct X : Fooable {
    let myFoo: Int
    let myName: String
    // Whoops, forgot to give an implementation of ==
}

print(X(myFoo: 42, myName: "Alice") == X(myFoo: 42, myName: "Bob")) // true
```

Re: [swift-evolution] [Proposal] Explicit Synthetic Behaviour

2017-09-07 Thread Itai Ferber via swift-evolution
I think there is a bit of confusion here as to what code synthesis does 
— synthesized conformances (whether `Equatable`, `Hashable`, or 
`Codable`) merely provide default implementations for something which 
_already_ conforms to one of these protocols; they do not _add_ 
conformance to types on your behalf.


```swift
struct X {
let val: Int
}
```

under synthesized `Equatable` does _not_ get an `==` defined for it in 
the same way that it does not get `encode(to:)` or `init(from:)`. Since 
it does not conform to the `Equatable` (or `Codable`) protocol, no 
synthesis happens for it.


As opposed to

```swift
struct Y : Equatable {
let val: Int
}
```

which would get a default implementation for `static func ==(…)`, 
which it would otherwise already have to implement, by definition.


Synthesis does not add methods on your behalf; it only gives 
implementations for methods you’d have to implement, no matter what. I 
don’t know what’s going on in your case, but it’s not caused by 
synthesis — if your type conforms to `Equatable`, either you would 
have to define `==` yourself, or you’d get a free one. You’d see 
ambiguity regardless, since you asked for the type to be `Equatable` (or 
inherited that requirement).


On 7 Sep 2017, at 10:32, Gwendal Roué via swift-evolution wrote:

> On 7 Sep 2017, at 14:37, Matthew Johnson wrote:


I don't understand what this has to do with synthesized Equatable.  
Wouldn't manually implemented Equatable have the same impact?  The 
design of a DSL should be able to accommodate conformance to basic 
protocols without ambiguity.



I'll explain you:

The problem with synthesized Equatable is that it adds an unwanted == 
operator that returns a Bool.


This operator is unwanted because it conflicts with the == operator 
defined by the DSL which does not return a Bool.


// Without synthesised Equatable
let r = (a == b) // the type defined by the DSL

// With synthesised Equatable
let r = (a == b) // ambiguous

This is the same kind of conflict that happen when a function is 
overloaded with two return types:


func f() -> Int { ... }
func f() -> String { ... }
f() // ambiguous

Without the synthesized Equatable, the type would not have any == 
operator that returns a Bool, and thus no conflict with the == 
operator defined by the DSL (the one that returns an SQL expression, 
in our particular context).


I hope that I have explained how synthesized conformance may impact 
code by the mere fact that they define methods. I'm not talking about 
the correctness of the synthesized code. I'm talking about its mere 
existence.
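
The clash described above can be sketched concretely. `SQLExpression` and `Column` are hypothetical stand-ins for the DSL types Gwendal describes:

```swift
// Hypothetical DSL type: == on columns builds an SQL expression, not a Bool.
struct SQLExpression { let sql: String }

// A raw-value enum gets an implicit Equatable (Column, Column) -> Bool ==.
enum Column: String {
    case name, email
}

// The DSL's overload on the same operands, with a different return type:
func == (lhs: Column, rhs: Column) -> SQLExpression {
    SQLExpression(sql: "\(lhs.rawValue) = \(rhs.rawValue)")
}

// Without type context, both overloads apply and the expression is ambiguous:
// let bad = (Column.name == Column.email)  // error: ambiguous use of '=='

// An explicit type annotation resolves it, but callers now carry that burden:
let r: SQLExpression = (Column.name == Column.email)
print(r.sql)  // name = email
```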


We generally want as many types to be Equatable and Hashable as 
possible.  Synthesized conformance means more types will have these 
conformance and that's a good thing in all cases (so long as the 
implementation is correct).


Sure, of course. I'm with you. I'm not talking against code synthesis. 
Again, I'm not talking about the correctness either.


I'm talking about the consequences of implicit and non-avoidable 
synthesis. Exactly the theme of this thread, unless I'm totally 
mistaken.


Gwendal Roué






Re: [swift-evolution] [Pitch] DateComponents{Encoding/Decoding}Strategy in JSON{Encoder/Decoder}

2017-09-06 Thread Itai Ferber via swift-evolution

Hi Pitiphong,

Don’t worry — your original email was clear, and we are on the same 
page about `Date{En,De}codingStrategy` and 
`DateComponents{En,De}codingStrategy` being separate things.

To clarify my points, though, there are two main things I want to say:

1. I think there is a mismatch here between your goal of representing 
the components of a date (and what `DateComponents` can specifically 
hold) and the goal of ISO 8601
2. I think that there is an inherent problem in parsing `DateComponents` 
due to ambiguity


I think both of these issues can be solved by reading and writing a 
`Date` (formatted however you need it to be) instead of 
`DateComponents`.


To elaborate:

* `DateComponents` is meant to be a container for an _arbitrary_ subset 
of information about a `Date`. A `Date` represents a specific instant in 
time, but `DateComponents` are effectively meaningless without 
additional context. In the examples that you give, it’s possible to 
represent the concepts at hand with `DateComponents`, but in order to 
make those components actionable and meaningful, you still need to 
convert them to `Date`s. Note also that:


* It’s entirely possible to create a `DateComponents` which 
represents a date which does not exist, or a time which does not exist
* Any of these concepts can also be represented by a `Date` instead 
of just components; e.g., an all-day event can be represented by a 
`Date` that represents the beginning of the day (`00:00:00`) and a flag 
that indicates that the time of the event can be ignored, or by a start 
`Date` that represents the start of the day and an end `Date` that 
represents the end of the day


* Unlike `DateComponents`, ISO 8601 strings have some structure to them. 
They cannot represent just a time zone, for instance, or some singular 
components of a date/time (e.g. a month without a year, a day without a 
month and year, a minute without an hour, a second without a minute and 
hour, etc.). I think this is a relatively large conceptual mismatch that 
is worth considering deeply. There are a lot of `DateComponents` 
instances which simply cannot be represented by an ISO 8601 string
* There is also the issue of decoding arbitrary ISO 8601 strings into 
`DateComponents`. `DateComponents`, having no structure at all, have no 
specified format they can expect to decode from, and ISO 8601 does not 
always provide that structure. Consider the following example:


* ISO 8601 allows for date representations by year, month, and day 
(`YYYY-MM-DD`), among other forms. But it also allows days to be left 
unspecified (`YYYY-MM`), and even months (`YYYY`)
* Similarly, it allows for a time representations by hour, minute, 
and second (`hh:mm:ss`), but also just hour and minute (`hh:mm`), and 
just hour (`hh`). Importantly, it allows time separators to be omitted 
(`hhmmss`, `hhmm`, `hh`)
* Consider then, attempting to parse the string `"2017"` without 
any context — what `DateComponents` should be read out? Intuitively, 
`2017` looks like a year (`YYYY`), but it is equally valid to parse as 
the time `20:17` (`hhmm`). Without knowing the expected format, parsing 
is ambiguous


  We cannot promise to parse `DateComponents` in all cases because 
there are many combinations of strings that are just completely 
ambiguous.
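
The ambiguity is easy to demonstrate with `DateFormatter` (an illustrative sketch; both parses below succeed, and only the caller-chosen format string disambiguates the input):

```swift
import Foundation

let formatter = DateFormatter()
formatter.locale = Locale(identifier: "en_US_POSIX")
formatter.timeZone = TimeZone(identifier: "UTC")

// Read "2017" as a year...
formatter.dateFormat = "yyyy"
let asYear = formatter.date(from: "2017")  // start of the year 2017

// ...or as the time 20:17.
formatter.dateFormat = "HHmm"
let asTime = formatter.date(from: "2017")  // 20:17 on the formatter's default date

// Both interpretations parse successfully; nothing in the string itself
// says which one was meant.
print(asYear != nil, asTime != nil)  // true true
```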


* So, to get at the core of this — if there is a specific format that 
you would like to encode to and from, why not do so with a `Date` and a 
`DateFormatter` (or if you need ISO 8601 specifically, 
`ISO8601DateFormatter`)? With a formatter, the format is unambiguous 
because you explicitly provide it, and there is nothing the date can’t 
represent that `DateComponents` can. You can always parse the date and 
pull out only those components that you care about. You also mention 
interoperability with an external JSON source — how is that source 
producing a string/parsing one back? [What I’m getting at here is: 
what is the value of adding a new, potentially risky strategy over 
existing methods that might work just as well, or better?]
* And lastly, if `.iso8601` is not necessarily a good fit for this 
strategy, what separates `.custom` from just overriding `encode(to:)` 
and `init(from:)` and writing the components out in the format that you 
need?
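To make that last alternative concrete, here is a rough sketch (the `Event` type and its keys are hypothetical) of overriding `init(from:)` and `encode(to:)` to read and write exactly the components you care about, with no new strategy needed:

```swift
import Foundation

struct Event: Codable {
    var components: DateComponents

    private enum CodingKeys: String, CodingKey {
        case year, month, day
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Pull out only the components this type cares about.
        components = DateComponents(
            year: try container.decodeIfPresent(Int.self, forKey: .year),
            month: try container.decodeIfPresent(Int.self, forKey: .month),
            day: try container.decodeIfPresent(Int.self, forKey: .day))
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        // Write out only the components that are actually set.
        try container.encodeIfPresent(components.year, forKey: .year)
        try container.encodeIfPresent(components.month, forKey: .month)
        try container.encodeIfPresent(components.day, forKey: .day)
    }
}
```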


I think answers to these questions can help us push this forward. :)

— Itai

On 5 Sep 2017, at 10:41, Pitiphong Phongpattranont wrote:


Hi Itai,

I think my first pitch email was not clear enough, and I want to 
apologize for that. I have been working on a calendar app for a while 
and understand the concepts of calendar and date/time programming at 
some level. I didn’t pitch the idea of encoding and decoding `Date` 
values with this `DateComponents{Encoding/Decoding}Strategy`. I still 
agree that `Date` values should be encoded/decoded with the 
`Date{Encoding/Decoding}Strategy`. The 
`DateComponents{Encoding/Decoding}Strategy` I pitched applies to 
`DateComponents` values only.


About the u

Re: [swift-evolution] [Pitch] DateComponents{Encoding/Decoding}Strategy in JSON{Encoder/Decoder}

2017-09-05 Thread Itai Ferber via swift-evolution

Hi Pitiphong,

Thanks for pitching this! My main question here is about the use case. 
Since encoding/decoding strategies apply to all values in a payload 
(whether or not those belong to types that you own), they inherently 
come with some risk.
What is the use case in mind for needing to encode and decode 
`DateComponents` directly, as opposed to encoding and decoding a `Date` 
instance and pulling the components you need from that?


From a correctness standpoint, I also want to point out that 
`DateComponents` is really just a "bag of stuff" that doesn’t 
necessarily mean much until converted into a `Date` through a `Calendar` 
and a `TimeZone`. There is somewhat of a mismatch between this "bag of 
stuff" and what ISO 8601 intends to represent — an actual date and 
time. It’s possible to represent things in a `DateComponents` that 
don’t really make sense for (or are not supported by) 
ISO-8601-formatted dates. For instance, you can have a `DateComponents` 
which just has a `TimeZone`, but ISO 8601 does not allow representing a 
time zone without a corresponding time. `DateComponents` also, for 
instance, has a `quarter` component (among others) which I’m almost 
certain ISO 8601 has no equivalent for.
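As a small illustration of that mismatch (the component values here are arbitrary):

```swift
import Foundation

// A perfectly valid "bag of stuff"...
var components = DateComponents()
components.timeZone = TimeZone(identifier: "America/Los_Angeles")
components.quarter = 3

// ...but ISO 8601 has no way to express a bare time zone, nor a
// quarter, without an accompanying date and time.
```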


Given that conceptual mismatch, I think we’d need a very compelling 
use case to support this over simply using `Date`.


— Itai

On 3 Sep 2017, at 0:55, Pitiphong Phongpattranont via swift-evolution 
wrote:


Hi folks, I have an idea on improving the JSON{Encoder/Decoder} to 
pitch.


JSON doesn’t have a native representation for `DateComponents`, just 
as it doesn’t have one for `Date`, so there are many ways to represent 
such values in JSON — for example ISO 8601, UNIX timestamp, etc. for 
`Date`. There are also a few ways to represent `DateComponents`; for 
example, ISO 8601 (https://en.wikipedia.org/wiki/ISO_8601) also 
describes how to represent some of the valid date components (e.g. 
"2017-09-03"). JSON{Encoder/Decoder} lets you represent `Date` values 
with several strategies, but there is no such support for 
`DateComponents`.


The current implementation encodes/decodes `DateComponents` with a 
keyed container and cannot provide a custom or ISO-8601-compatible 
representation. So I think JSON{Encoder/Decoder} should have a 
strategy for encoding/decoding `DateComponents`, just like for `Date`.


Here’s an initial `DateComponentsStrategy` that I would want 
JSON{Encoder/Decoder} to support — it’s what I can think of now; any 
suggestion is welcome.


```swift
  /// The strategy to use for encoding `DateComponents` values.
  public enum DateComponentsStrategy {
    /// Defer to `DateComponents` for choosing an encoding. This is the
    /// default strategy.
    case deferredToDateComponents

    /// Encode the `DateComponents` as an ISO-8601-formatted string (in
    /// RFC 3339 format).
    case iso8601

    /// Encode the `DateComponents` as a custom value encoded by the
    /// given closure.
    ///
    /// If the closure fails to encode a value into the given encoder,
    /// the encoder will encode an empty automatic container in its place.
    case custom((DateComponents, Encoder) throws -> Void)
  }
```
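For comparison, this mirrors the existing `Date` strategy API, which can already be configured today:

```swift
import Foundation

struct Log: Codable {
    let timestamp: Date
}

let encoder = JSONEncoder()
// Pick one of the existing Date strategies.
encoder.dateEncodingStrategy = .iso8601

// Produces JSON along the lines of {"timestamp":"2017-09-03T12:34:56Z"}.
let data = try encoder.encode(Log(timestamp: Date()))
```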

What do you guys think about this pitch?


Pitiphong Phongpattranont
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Why you can't make someone else's class Decodable: a long-winded explanation of 'required' initializers

2017-08-07 Thread Itai Ferber via swift-evolution


> On Aug 6, 2017, at 12:58 PM, Charles Srstka  wrote:
> 
>> On Aug 3, 2017, at 12:05 PM, Itai Ferber via swift-evolution 
>> wrote:
>> 
>> Thanks for putting these thoughts together, Jordan! Some additional comments 
>> inline.
>> 
>>> On Aug 2, 2017, at 5:08 PM, Jordan Rose wrote:
>>> David Hart recently asked on Twitter 
>>> <https://twitter.com/dhartbit/status/891766239340748800> if there was a 
>>> good way to add Decodable support to somebody else's class. The short 
>>> answer is "no, because you don't control all the subclasses", but David 
>>> already understood that and wanted to know if there was anything working to 
>>> mitigate the problem. So I decided to write up a long email about it 
>>> instead. (Well, actually I decided to write a short email and then failed 
>>> at doing so.)
>>> 
>>> The Problem
>>> 
>>> You can add Decodable to someone else's struct today with no problems:
>>> 
>>> extension Point: Decodable {
>>>   enum CodingKeys: String, CodingKey {
>>> case x
>>> case y
>>>   }
>>>   public init(from decoder: Decoder) throws {
>>> let container = try decoder.container(keyedBy: CodingKeys.self)
>>> let x = try container.decode(Double.self, forKey: .x)
>>> let y = try container.decode(Double.self, forKey: .y)
>>> self.init(x: x, y: y)
>>>   }
>>> }
>>> 
>>> But if Point is a (non-final) class, then this gives you a pile of errors:
>>> 
>>> - init(from:) needs to be 'required' to satisfy a protocol requirement. 
>>> 'required' means the initializer can be invoked dynamically on subclasses. 
>>> Why is this important? Because someone might write code like this:
>>> 
>>> func decodeMe() -> Result {
>>>   let decoder = getDecoderFromSomewhere()
>>>   return Result(from: decoder)
>>> }
>>> let specialPoint: VerySpecialSubclassOfPoint = decodeMe()
>>> 
>>> …and the compiler can't stop them, because VerySpecialSubclassOfPoint is a 
>>> Point, and Point is Decodable, and therefore VerySpecialSubclassOfPoint is 
>>> Decodable. A bit more on this later, but for now let's say that's a 
>>> sensible requirement.
>>> 
>>> - init(from:) also has to be a 'convenience' initializer. That one makes 
>>> sense too—if you're outside the module, you can't necessarily see private 
>>> properties, and so of course you'll have to call another initializer that 
>>> can.
>>> 
>>> But once it's marked 'convenience' and 'required' we get "'required' 
>>> initializer must be declared directly in class 'Point' (not in an 
>>> extension)", and that defeats the whole purpose. Why this restriction?
>>> 
>>> 
>>> The Semantic Reason
>>> 
>>> The initializer is 'required', right? So all subclasses need to have access 
>>> to it. But the implementation we provided here might not make sense for all 
>>> subclasses—what if VerySpecialSubclassOfPoint doesn't have an 'init(x:y:)' 
>>> initializer? Normally, the compiler checks for this situation and makes the 
>>> subclass reimplement the 'required' initializer…but that only works if the 
>>> 'required' initializers are all known up front. So it can't allow this new 
>>> 'required' initializer to go by, because someone might try to call it 
>>> dynamically on a subclass. Here's a dynamic version of the code from above:
>>> 
>>> func decodeDynamic(_ pointType: Point.Type) -> Point {
>>>   let decoder = getDecoderFromSomewhere()
>>>   return pointType.init(from: decoder)
>>> }
>>> let specialPoint = decodeDynamic(VerySpecialSubclassOfPoint.self)
>>> 
>>> 
>>> The Implementation Reason
>>> 
>>> 'required' initializers are like methods: they may require dynamic 
>>> dispatch. That means that they get an entry in the class's dynamic dispatch 
>>> table, commonly known as its vtable. Unlike Objective-C method tables, 
>>> vtables aren't set up to have entries arbitrarily added at run time.
>>> 
>>> (Aside: This is one of the reasons why non-@objc methods in Swift 

Re: [swift-evolution] Why you can't make someone else's class Decodable: a long-winded explanation of 'required' initializers

2017-08-04 Thread Itai Ferber via swift-evolution
To clarify a bit here — this isn’t a "privilege" so much as a 
property of the design of these classes.
`NSData`, `NSString`, `NSArray`, and some others, are all known as 
_class clusters_; the classes you know and use are essentially abstract 
base classes whose implementation is given in private concrete 
subclasses that specialize based on usage. These classes are essentially 
an abstract interface for subclasses to follow. You can take a look at 
the [subclassing notes for 
`NSArray`](https://developer.apple.com/documentation/foundation/nsarray#1651549), 
for instance, to see the guidelines offered for subclassing such a base 
class.


The reason you can relatively safely offer `static` extensions on these 
types is that it’s reasonably rare to need to subclass them, and at 
that, even rarer to offer any interface _besides_ what’s given by the 
base class. You can rely on the, say, `NSString` interface to access all 
functionality needed to represent a string. If I were to subclass 
`NSString` with totally different properties, though, your `static` 
extension might not take that into account.


Not all types you list here are class clusters, BTW, but they largely 
fall into the same category of "never really subclassed". There’s no 
real need for anyone to subclass `NSDate` or `NSDecimalNumber` (since 
they’re pretty low-level structural types), so this should apply to 
those as well.


In general, this property applies to all types like this which are 
rarely subclassed. In Swift, types like this might fall under a `final 
class` designation, though in Objective-C it’s more by convention/lack 
of need than by strict enforcement. There’s a reason we offer some of 
these as `struct`s in Swift (e.g. `Date`, `Decimal`, `Data`, etc.).


On 3 Aug 2017, at 21:03, Gwendal Roué wrote:


On 3 Aug 2017 at 19:10, Itai Ferber wrote:

I just mentioned this in my other email, but to point out here: the 
reason this works in your case is because you adopt these methods as 
static funcs and can reasonably rely on subclasses of NSData, 
NSNumber, NSString, etc. to do the right thing because of work done 
behind the scenes in the ObjC implementations of these classes (and 
because we’ve got established subclassing requirements on these 
methods — all subclasses of these classes are going to look 
approximately the same without doing anything crazy).


This would not work for Codable in the general case, however, where 
subclasses likely need to add additional storage, properties, encoded 
representations, etc., without equivalent requirements, either via 
additional protocols or conventions.


Thanks for your explanation of why a static method in a protocol is 
able to instantiate non-final classes like NSData, NSDate, NSNumber, 
NSDecimalNumber, NSString, etc.


Is this "privilege" stable? Can I rely on it to be maintained over 
time? Or would it be a better idea to drop support for those low-level 
Foundation classes, because they'll eventually become regular classes 
without any specific support? This would not harm that much: Data, 
Date, String are there for a reason. NSDecimalNumber is the only one 
of its kind, though.


Gwendal
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Why you can't make someone else's class Decodable: a long-winded explanation of 'required' initializers

2017-08-03 Thread Itai Ferber via swift-evolution
I just mentioned this in my other email, but to point out here: the reason this 
works in your case is because you adopt these methods as static funcs and can 
reasonably rely on subclasses of NSData, NSNumber, NSString, etc. to do the 
right thing because of work done behind the scenes in the ObjC implementations 
of these classes (and because we’ve got established subclassing requirements on 
these methods — all subclasses of these classes are going to look approximately 
the same without doing anything crazy).

This would not work for Codable in the general case, however, where subclasses 
likely need to add additional storage, properties, encoded representations, 
etc., without equivalent requirements, either via additional protocols or 
conventions.
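A small sketch of the general case (the type names are purely illustrative): a subclass that adds storage must extend the coding logic itself, which no base-class conformance can anticipate:

```swift
import Foundation

class Animal: Codable {
    var name: String
    init(name: String) { self.name = name }
}

class Dog: Animal {
    var breed: String

    // New storage means new keys; the base class's conformance
    // cannot know about them.
    private enum CodingKeys: String, CodingKey { case breed }

    init(breed: String, name: String) {
        self.breed = breed
        super.init(name: name)
    }

    required init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        breed = try container.decode(String.self, forKey: .breed)
        try super.init(from: decoder)
    }

    override func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(breed, forKey: .breed)
        try super.encode(to: encoder)
    }
}
```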

> On Aug 3, 2017, at 1:50 AM, Gwendal Roué via swift-evolution 
>  wrote:
> 
> 
>> On 3 Aug 2017 at 02:09, Jordan Rose via swift-evolution wrote:
>> 
>> P.S. There's a reason why Decodable uses an initializer instead of a 
>> factory-like method on the type but I can't remember what it is right now. I 
>> think it's something to do with having the right result type, which would 
>> have to be either 'Any' or an associated type if it wasn't just 'Self'. (And 
>> if it is 'Self' then it has all the same problems as an initializer and 
>> would require extra syntax.) Itai would know for sure.
> 
> For anyone interested, factory methods *can* retroactively be added to 
> existing classes. This is how the SQLite library GRDB.swift is able to decode 
> classes hierarchies like NSString, NSNumber, NSDecimalNumber, etc. from 
> SQLite values:
> 
> The protocol for types that can instantiate from SQLite values has a factory 
> method:
> 
> public protocol DatabaseValueConvertible {
> /// Returns a value initialized from *dbValue*, if possible.
> static func fromDatabaseValue(_ dbValue: DatabaseValue) -> Self?
> }
> 
> Having Foundation classes implement it uses various techniques:
> 
> 1. "Casting" (Data to NSData, or NSDate to Date, depending on which type 
> provides the root conformance)
> 
> // Workaround Swift inconvenience around factory methods of non-final 
> classes
> func cast(_ value: T) -> U? {
> return value as? U
> }
> 
> extension NSData : DatabaseValueConvertible {
> public static func fromDatabaseValue(_ dbValue: DatabaseValue) -> 
> Self? {
> // Use Data conformance
> guard let data = Data.fromDatabaseValue(dbValue) else {
> return nil
> }
> return cast(data)
> }
> }
> 
> // Derives Date conformance from NSDate, for example
> extension ReferenceConvertible where Self: DatabaseValueConvertible, 
> Self.ReferenceType: DatabaseValueConvertible {
> public static func fromDatabaseValue(_ dbValue: DatabaseValue) -> 
> Self? {
> return ReferenceType.fromDatabaseValue(dbValue).flatMap { 
> cast($0) }
> }
> }
> 
> 
> 2. Using magic Foundation initializers (magic because the code below compiles 
> even if those are not *required* initializers). Works for NSNumber, 
> NSDecimalNumber, NSString:
> 
> extension NSNumber : DatabaseValueConvertible {
> public static func fromDatabaseValue(_ dbValue: DatabaseValue) -> 
> Self? {
> switch dbValue.storage {
> case .int64(let int64):
> return self.init(value: int64)
> case .double(let double):
> return self.init(value: double)
> default:
> return nil
> }
> }
> }
> 
> extension NSString : DatabaseValueConvertible {
> public static func fromDatabaseValue(_ dbValue: DatabaseValue) -> 
> Self? {
> // Use String conformance
> guard let string = String.fromDatabaseValue(dbValue) else {
> return nil
> }
> return self.init(string: string)
> }
> }
> 
> The magic about Foundation initializers above makes me doubt that this 
> technique is general enough for Decodable to profit from it, though. Yes it 
> runs on Linux, so I'm not even sure if objc runtime is required or not. I'm 
> clueless ???
> 
> Gwendal Roué
> 
> ___
> swift-evolution mailing list
> swift-evolution@swift.org
> https://lists.swift.org/mailman/listinfo/swift-evolution

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Why you can't make someone else's class Decodable: a long-winded explanation of 'required' initializers

2017-08-03 Thread Itai Ferber via swift-evolution
Thanks for putting these thoughts together, Jordan! Some additional comments 
inline.

> On Aug 2, 2017, at 5:08 PM, Jordan Rose  wrote:
> 
> David Hart recently asked on Twitter 
>  if there was a good 
> way to add Decodable support to somebody else's class. The short answer is 
> "no, because you don't control all the subclasses", but David already 
> understood that and wanted to know if there was anything working to mitigate 
> the problem. So I decided to write up a long email about it instead. (Well, 
> actually I decided to write a short email and then failed at doing so.)
> 
> The Problem
> 
> You can add Decodable to someone else's struct today with no problems:
> 
> extension Point: Decodable {
>   enum CodingKeys: String, CodingKey {
> case x
> case y
>   }
>   public init(from decoder: Decoder) throws {
> let container = try decoder.container(keyedBy: CodingKeys.self)
> let x = try container.decode(Double.self, forKey: .x)
> let y = try container.decode(Double.self, forKey: .y)
> self.init(x: x, y: y)
>   }
> }
> 
> But if Point is a (non-final) class, then this gives you a pile of errors:
> 
> - init(from:) needs to be 'required' to satisfy a protocol requirement. 
> 'required' means the initializer can be invoked dynamically on subclasses. 
> Why is this important? Because someone might write code like this:
> 
> func decodeMe() -> Result {
>   let decoder = getDecoderFromSomewhere()
>   return Result(from: decoder)
> }
> let specialPoint: VerySpecialSubclassOfPoint = decodeMe()
> 
> …and the compiler can't stop them, because VerySpecialSubclassOfPoint is a 
> Point, and Point is Decodable, and therefore VerySpecialSubclassOfPoint is 
> Decodable. A bit more on this later, but for now let's say that's a sensible 
> requirement.
> 
> - init(from:) also has to be a 'convenience' initializer. That one makes 
> sense too—if you're outside the module, you can't necessarily see private 
> properties, and so of course you'll have to call another initializer that can.
> 
> But once it's marked 'convenience' and 'required' we get "'required' 
> initializer must be declared directly in class 'Point' (not in an 
> extension)", and that defeats the whole purpose. Why this restriction?
> 
> 
> The Semantic Reason
> 
> The initializer is 'required', right? So all subclasses need to have access 
> to it. But the implementation we provided here might not make sense for all 
> subclasses—what if VerySpecialSubclassOfPoint doesn't have an 'init(x:y:)' 
> initializer? Normally, the compiler checks for this situation and makes the 
> subclass reimplement the 'required' initializer…but that only works if the 
> 'required' initializers are all known up front. So it can't allow this new 
> 'required' initializer to go by, because someone might try to call it 
> dynamically on a subclass. Here's a dynamic version of the code from above:
> 
> func decodeDynamic(_ pointType: Point.Type) -> Point {
>   let decoder = getDecoderFromSomewhere()
>   return pointType.init(from: decoder)
> }
> let specialPoint = decodeDynamic(VerySpecialSubclassOfPoint.self)
> 
> 
> The Implementation Reason
> 
> 'required' initializers are like methods: they may require dynamic dispatch. 
> That means that they get an entry in the class's dynamic dispatch table, 
> commonly known as its vtable. Unlike Objective-C method tables, vtables 
> aren't set up to have entries arbitrarily added at run time.
> 
> (Aside: This is one of the reasons why non-@objc methods in Swift extensions 
> can't be overridden; if we ever lift that restriction, it'll be by using a 
> separate table and a form of dispatch similar to objc_msgSend. I sent a 
> proposal to swift-evolution about this last year but there wasn't much 
> interest.)
> 
> 
> The Workaround
> 
> Today's answer isn't wonderful, but it does work: write a wrapper struct that 
> conforms to Decodable instead:
> 
> struct DecodedPoint: Decodable {
>   var value: Point
>   enum CodingKeys: String, CodingKey {
> case x
> case y
>   }
>   public init(from decoder: Decoder) throws {
> let container = try decoder.container(keyedBy: CodingKeys.self)
> let x = try container.decode(Double.self, forKey: .x)
> let y = try container.decode(Double.self, forKey: .y)
> self.value = Point(x: x, y: y)
>   }
> }
> 
> This doesn't have any of the problems with inheritance, because it only 
> handles the base class, Point. But it makes everywhere else a little less 
> convenient—instead of directly encoding or decoding Point, you have to use 
> the wrapper, and that means no implicitly-generated Codable implementations 
> either.
> 
> I'm not going to spend more time talking about this, but it is the officially 
> recommended answer at the moment. You can also just have all your own types 
> that contain points manually decode the 'x' and 'y' values and then construct 
> a Point from that.
I would actually take th

Re: [swift-evolution] Idea: Exposing _JSONEncoder and _JSONDecoder functionality

2017-07-25 Thread Itai Ferber via swift-evolution

Hi Morten,

This is something we’ve considered adding and may do so in the future 
— however, this will require additional API review and will not make 
it in time for the Swift 4.0 release.
The usage of `JSONSerialization` as the serialization backend is a 
current implementation detail, and may change in future releases; it 
would, for instance, be more efficient to read/write JSON as we 
encode/decode, instead of trying to collect the entire object graph 
before performing the next step.
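In the meantime, the conversion can be approximated by round-tripping through `Data`, at the cost of the extra serialization step — a sketch:

```swift
import Foundation

struct Point: Codable {
    var x: Double
    var y: Double
}

let data = try JSONEncoder().encode(Point(x: 1, y: 2))

// A JSONSerialization-compatible structure (an NSDictionary here),
// recovered by parsing the encoded bytes back.
let structure = try JSONSerialization.jsonObject(with: data)
```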


We could also introduce something like a general 
`StructureEncoder`/`StructureDecoder` which performs this conversion, as 
this might be useful outside of just JSON. (For instance, 
`PropertyListEncoder`/`PropertyListDecoder` currently do something 
similar.)


So yes, this is under consideration for future API. :)

— Itai

On 25 Jul 2017, at 0:44, Morten Bek Ditlevsen via swift-evolution wrote:

In the implementation of JSONEncoder and JSONDecoder we have the 
internal
functionality that encodes Codable types to a structure that is 
compatible

with JSONSerialization.
The JSONEncoder then calls JSONSerialization on this structure - and
JSONDecoder vice versa.
In some situations it could be nice to just encode or decode to or 
from
this structure - and not have the final encoding to Data performed. 
For
instance if you have some library that hands you a structure in this 
format

and you would like to perform decoding.

Can anyone else see the relevance of exposing this functionality? It 
could
be exposed by wrapping _JSONEncoder in a public JSONStructureEncoder 
or

something similar.

Sincerely,
/morten
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Idea: Use for default implementation of ?

2017-07-12 Thread Itai Ferber via swift-evolution
This would be possible, but I have the following concerns about doing 
something like this:


* This would only be available when Foundation is imported (since this 
would require access to `JSONEncoder`). This could make debugging 
confusing if you don’t always import Foundation
* What do you get when encoding a value fails? Or for values which would 
otherwise require a strategy to encode correctly? (e.g. you’ve got a 
property whose value is `Double.infinity` which is unrepresentable in 
JSON)
* From some experience, I feel like this could lead to easy abuse of 
conversion to JSON (what’s easier, creating a `JSONEncoder` and 
encoding, or asking for the `debugDescription` on a type?)… Once 
people start relying on this, too, we’ll have compatibility problems 
if we ever want to change the format
* And along with that, I don’t necessarily feel comfortable with 
promoting JSON in this way. It’s very popular today, but then again, 
there was a time where XML was the format du jour… 😬


Interested in hearing more thoughts and input, though!
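For what it’s worth, anyone who wants this behavior can already opt in per-project with a small extension (the property name here is hypothetical):

```swift
import Foundation

extension Encodable {
    // Hypothetical helper: a manual, opt-in JSON dump for debugging,
    // rather than an automatic conformance.
    var jsonDebugDescription: String {
        guard let data = try? JSONEncoder().encode(self),
              let string = String(data: data, encoding: .utf8) else {
            return "<failed to encode \(type(of: self))>"
        }
        return string
    }
}
```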

On 12 Jul 2017, at 15:51, William Shipley via swift-evolution wrote:

Would it be possible for items that implement  to 
automatically get  with a default 
implementation that just calls encode into a JSON blob and then dumps 
it to the screen?


Any drawbacks to doing this?

-Wil
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Request for Feedback] Providing defaults for reading and writing.

2017-07-12 Thread Itai Ferber via swift-evolution

That’s fair. :)
I think in the time frame of Swift 4, this would be too big of an 
addition and would require more thought, but:


1. When the conditional conformance feature arrives in a future Swift 
release, a lot of the hacks surrounding `Equatable` can go away here, 
because we’ll get things like `Array : Equatable where 
Element : Equatable` and `Array : Codable where Element : 
Codable`
2. This seems like an easily additive feature — overloads taking 
defaults can be added after the fact (given a default implementation 
which does something similar to what you and Randy suggested):


```swift
// Just an example:
extension KeyedEncodingContainerProtocol {
    mutating func encode<T : Encodable & Equatable>(_ value: T, forKey key: Key, defaultValues defaults: [Key : Any]) throws {
        // Skip the key entirely when the value matches its registered default.
        if let defaultValue = defaults[key] as? T, value == defaultValue {
            return
        }

        try encode(value, forKey: key)
    }
}

extension KeyedDecodingContainerProtocol {
    func decode<T : Decodable>(_ type: T.Type, forKey key: Key, defaultValues defaults: [Key : Any]) throws -> T {
        // Fall back to the registered default when the key is missing.
        guard let defaultValue = defaults[key] as? T else {
            return try decode(type, forKey: key)
        }

        return try decodeIfPresent(type, forKey: key) ?? defaultValue
    }
}
```

On 11 Jul 2017, at 13:16, William Shipley wrote:

You’re right, my current implementation doesn’t win anything over 
what you’re written -  in fact your technique is basically what I 
wrote at first.


I was trying to work towards encapsulating the behavior in the 
encoder/decoder so that the automatic init/encode methods could work, 
so I wanted to introduce my first (more manual) attempt and then say, 
here’s where I’d like to get with this.


-Wil



On Jul 11, 2017, at 10:16 AM, Itai Ferber  wrote:

Hi Wil,

Thanks for putting this together! My biggest thought on this is — 
what does this provide that you can’t already do yourself today?
Since you have to go through the work to put together default values 
and override init(from:) and encode(to:) to use them, I’m wondering 
whether this saves you any work over doing something like the 
following:


struct Theme {
private static let _defaultName = ""
private static let _defaultStyles: [String] = []

public let name: String
public let styles: [String]

private enum CodingKeys : String, CodingKey {
case name
case styles
}

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        name = (try? container.decode(String.self, forKey: .name)) ?? Theme._defaultName
        styles = (try? container.decode([String].self, forKey: .styles)) ?? Theme._defaultStyles
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        if name != Theme._defaultName { try container.encode(name, forKey: .name) }
        if styles != Theme._defaultStyles { try container.encode(styles, forKey: .styles) }
    }
}
This reads just as clearly to me as the defaults: variation while 
having the added benefit of low complexity and stronger type safety 
(as there’s no as!-casting down from Any, which could fail).


Thoughts?

— Itai

On 10 Jul 2017, at 17:16, William Shipley via swift-evolution wrote:

Automatic substitution / removal of default values is very useful 
when reading or writing a file, respectively, and should be supported 
by the  family of protocols and objects:


• When reading, swapping in a default value for missing or 
corrupted values makes it so hand-created or third-party-created 
files don’t have to write every single value to make a valid file, 
and allows slightly corrupted files to auto-repair (or get close, and 
let the user fix up any data that needs it after) rather than 
completely fail to load. (Repairing on read creates a virtuous cycle 
with user-created files, as the user will get _some_ feedback on her 
input even if she’s messed up, for example, the type of one of the 
properties.)


• When writing, providing a default value allows the container to 
skip keys that don’t contain useful information. This can 
dramatically reduce file sizes, but I think its other advantages are 
bigger wins: just like having less source code makes a program easier 
to debug, having less “data code” makes files easier to work with 
in every way — they’re easier to see differences in, easier to 
determine corruption in, easier to edit by hand, and easier to learn 
from.



My first pass attempt at adding defaults to Codable looks like this:


public class ReferencePieceFromModel : Codable {

// MARK: properties
public let name: String = ""
public let styles: [String] = []


// MARK: 
public required init(from decoder: Decoder) throws {
let container = try decoder.container(keyedBy: CodingKeys.self)

self.name = container.decode

Re: [swift-evolution] [Request for Feedback] Providing defaults for reading and writing.

2017-07-11 Thread Itai Ferber via swift-evolution

Hi Wil,

Thanks for putting this together! My biggest thought on this is — what 
does this provide that you can’t already do yourself today?
Since you have to go through the work to put together default values and 
override `init(from:)` and `encode(to:)` to use them, I’m wondering 
whether this saves you any work over doing something like the following:


```swift
struct Theme {
private static let _defaultName = ""
private static let _defaultStyles: [String] = []

public let name: String
public let styles: [String]

private enum CodingKeys : String, CodingKey {
case name
case styles
}

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        name = (try? container.decode(String.self, forKey: .name)) ?? Theme._defaultName
        styles = (try? container.decode([String].self, forKey: .styles)) ?? Theme._defaultStyles
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        if name != Theme._defaultName { try container.encode(name, forKey: .name) }
        if styles != Theme._defaultStyles { try container.encode(styles, forKey: .styles) }
    }
}
```

This reads just as clearly to me as the `defaults:` variation while 
having the added benefit of low complexity and stronger type safety (as 
there’s no `as!`-casting down from `Any`, which could fail).


Thoughts?

— Itai

On 10 Jul 2017, at 17:16, William Shipley via swift-evolution wrote:

Automatic substitution / removal of default values is very useful when 
reading or writing a file, respectively, and should be supported by 
the  family of protocols and objects:


• When reading, swapping in a default value for missing or corrupted 
values makes it so hand-created or third-party-created files don’t 
have to write every single value to make a valid file, and allows 
slightly corrupted files to auto-repair (or get close, and let the 
user fix up any data that needs it after) rather than completely fail 
to load. (Repairing on read creates a virtuous cycle with user-created 
files, as the user will get _some_ feedback on her input even if 
she’s messed up, for example, the type of one of the properties.)


• When writing, providing a default value allows the container to 
skip keys that don’t contain useful information. This can 
dramatically reduce file sizes, but I think its other advantages are 
bigger wins: just like having less source code makes a program easier 
to debug, having less “data code” makes files easier to work with 
in every way — they’re easier to see differences in, easier to 
determine corruption in, easier to edit by hand, and easier to learn 
from.



My first pass attempt at adding defaults to Codable looks like this:


public class ReferencePieceFromModel : Codable {

    // MARK: properties
    public let name: String
    public let styles: [String]

    // MARK:
    public required init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)

        self.name = container.decode(String.self, forKey: .name, defaults: type(of: self).defaultsByCodingKey)
        self.styles = container.decode([String].self, forKey: .styles, defaults: type(of: self).defaultsByCodingKey)
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)

        try container.encode(name, forKey: .name, defaults: type(of: self).defaultsByCodingKey)
        try container.encode(styles, forKey: .styles, defaults: type(of: self).defaultsByCodingKey)
    }

    private static let defaultsByCodingKey: [CodingKeys : Any] = [
        .name : "",
        .styles : [String]()
    ]

    // MARK: private
    private enum CodingKeys : String, CodingKey {
        case name
        case styles
    }
}

With just a couple additions to the Swift libraries:

extension KeyedDecodingContainer where Key : Hashable {
    func decode<T>(_ type: T.Type, forKey key: Key, defaults: [Key : Any]) -> T where T : Decodable {
        if let typedValueOptional = try? decodeIfPresent(T.self, forKey: key), let typedValue = typedValueOptional {
            return typedValue
        } else {
            return defaults[key] as! T
        }
    }
}

extension KeyedEncodingContainer where Key : Hashable {
    mutating func encode<T>(_ value: T, forKey key: Key, defaults: [Key : Any]) throws where T : Encodable & Equatable {
        if value != (defaults[key] as! T) {
            try encode(value, forKey: key)
        }
    }

    // I AM SO SORRY, THIS IS ALL I COULD FIGURE OUT TO MAKE [String] WORK!
    mutating func encode<T>(_ value: [T], forKey key: Key, defaults: [Key : Any]) throws where T : Encodable & Equatable {
        if value != (defaults[key] as! [T]) {
            try encode(value, forKey: key)
        }
    }
}


(Note the horrible hack on KeyedEncodingContainer where I h
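[Editor's note: for readers, the `[Key : Any]` dictionary, the `as!` downcasts, and the separate `[T]` overload can all be avoided by passing the default value directly. This is not part of Wil's proposal, just an illustrative alternative sketch; the `defaultingTo:` label is invented here. The `[T]` overload above was needed because `Array` did not conform to `Equatable` before Swift 4.1's conditional conformances.]

```swift
extension KeyedDecodingContainer {
    // Type-safe defaulting: the default's type is checked at compile time,
    // so no [Key : Any] dictionary and no force-cast are needed.
    func decode<T: Decodable>(_ type: T.Type, forKey key: Key, defaultingTo defaultValue: T) throws -> T {
        return try decodeIfPresent(T.self, forKey: key) ?? defaultValue
    }
}

extension KeyedEncodingContainer {
    // Skip encoding when the value equals its default.
    mutating func encode<T: Encodable & Equatable>(_ value: T, forKey key: Key, defaultingTo defaultValue: T) throws {
        if value != defaultValue {
            try encode(value, forKey: key)
        }
    }
}
```

With Swift 4.1 or later, `[String]` is `Equatable` whenever its elements are, so the single generic `encode` overload covers arrays as well.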

Re: [swift-evolution] [Update] Updates to SE-0166 and SE-0167

2017-06-26 Thread Itai Ferber via swift-evolution
Taking your code as an example:
struct Foo : Codable {
var prop1: Int?
var prop2: Int?

enum CodingKeys : String, CodingKey { ... }

init(from decoder: Decoder) throws {
let container = try decoder.container(keyedBy: CodingKeys.self)
prop1 = try container.decodeIfPresent(Int.self, forKey: .prop1)
prop2 = try container.decode(Int?.self, forKey: .prop2)
}

func encode(to encoder: Encoder) throws { ... }
}

try decoder.decode(Foo.self, from: "{ \"prop1\": 42, \"prop2\": 99 
}".data(using: .utf8)!)
// => prop1 == Optional(42), prop2 == Optional(99)

try decoder.decode(Foo.self, from: "{ \"prop1\": null, \"prop2\": 99 
}".data(using: .utf8)!)
// => prop1 == nil, prop2 == Optional(99)

try decoder.decode(Foo.self, from: "{ \"prop1\": 42, \"prop2\": null 
}".data(using: .utf8)!)
// => prop1 == Optional(42), prop2 == nil

try decoder.decode(Foo.self, from: "{ \"prop2\": 99 }".data(using: .utf8)!)
// => prop1 == nil, prop2 == Optional(99)

try decoder.decode(Foo.self, from: "{ \"prop1\": 42 }".data(using: .utf8)!)
// => error, .keyNotFound (key "prop2" is missing)

decode(_:forKey:) always expects the key to be there; if T == Optional 
then the value may be null, but the entry must be present, since that’s what 
you’re asserting.
decodeIfPresent(_:forKey:) will return nil if the key is not present, or if 
T == Optional and the value is null.

(This, BTW, is not a change in semantics from how things work today.)
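Concretely, the two behaviors can be seen side by side in a single type. This sketch mirrors the `Foo` example above, using Foundation's `JSONDecoder`:

```swift
import Foundation

struct Wrapper : Decodable {
    var strict: Int?    // decoded with decode(Int?.self, forKey:)
    var lenient: Int?   // decoded with decodeIfPresent(Int.self, forKey:)

    enum CodingKeys : String, CodingKey { case strict, lenient }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Throws .keyNotFound if "strict" is absent; nil only for an explicit null.
        strict = try container.decode(Int?.self, forKey: .strict)
        // Returns nil if "lenient" is absent or null.
        lenient = try container.decodeIfPresent(Int.self, forKey: .lenient)
    }
}

let decoder = JSONDecoder()

// Succeeds: "strict" is present (as null), "lenient" is simply missing.
let v = try decoder.decode(Wrapper.self, from: "{ \"strict\": null }".data(using: .utf8)!)
// v.strict == nil, v.lenient == nil

// Throws .keyNotFound: "strict" is absent entirely.
_ = try? decoder.decode(Wrapper.self, from: "{ \"lenient\": 1 }".data(using: .utf8)!)
```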

> On Jun 26, 2017, at 1:03 PM, David Hart  wrote:
> 
> What I still have difficulties understanding is what will be the semantic 
> difference between decodeIfPresent and decode with optional type:
> 
> init(from decoder: Decoder) throws {
> let container = try decoder.container(keyedBy: CodingKeys.self)
> prop1 = try container.decodeIfPresent(Prop1Type.self, forKey: .prop1)
> prop2 = try container.decode(Optional.self, forKey: .prop2)
> }
> 
>> On 26 Jun 2017, at 19:10, Itai Ferber > <mailto:ifer...@apple.com>> wrote:
>> 
>> Reply-all this time too. :)
>> Thanks for the feedback, David!
>> 
>> encodeIfPresent and decodeIfPresent are not strictly necessary, but they’re 
>> useful for further cutting down boilerplate. encodeIfPresent is equivalent to
>> 
>> if let value = value {
>> try container.encode(value, forKey: .someKey)
>> }
>> and decodeIfPresent is equivalent to
>> 
>> if container.contains(.someKey) {
>> value = try container.decode(Value.self, forKey: .someKey)
>> } else {
>> value = nil
>> }
>> They’re not big, but when you have a long list of optional properties, it’s 
>> much easier to read and comprehend than staring at a wall of Optional 
>> wrapping/unwrapping:
>> 
>> init(from decoder: Decoder) throws {
>> let container = try decoder.container(keyedBy: CodingKeys.self)
>> 
>> if container.contains(.prop1) {
>> prop1 = try container.decode(Prop1Type.self, forKey: .prop1)
>> } else {
>> prop1 = nil
>> }
>> 
>> if container.contains(.prop2) {
>> prop2 = try container.decode(Prop2Type.self, forKey: .prop2)
>> } else {
>> prop2 = nil
>> }
>> 
>> if container.contains(.prop3) {
>> prop3 = try container.decode(Prop3Type.self, forKey: .prop3)
>> } else {
>> prop3 = nil
>> }
>> }
>> 
>> // vs.
>> 
>> init(from decoder: Decoder) throws {
>> let container = try decoder.container(keyedBy: CodingKeys.self)
>> prop1 = try container.decodeIfPresent(Prop1Type.self, forKey: .prop1)
>> prop2 = try container.decodeIfPresent(Prop2Type.self, forKey: .prop2)
>> prop3 = try container.decodeIfPresent(Prop3Type.self, forKey: .prop3)
>> }
>> On 23 Jun 2017, at 13:52, David Hart wrote:
>> 
>> There are a lot of great changes here which make sense after the fact. I'll 
>> try to play around with them.
>> 
>> One thing I'm concerned about: with the new Optional conformance, why do we 
>> still need decodeIfPresent and encodeIfPresent? They seem superfluous now, 
>> and potentially confusing. Should users call encodeIfPresent/decodeIfPresent 
>> or encode/decode with an optional type? Do they have the same semantics?
>> 
>> On 23 Jun 2017, at 21:47, Itai Ferber via swift-evolution <swift-evolution@swift.org> wrote:
>> 
>>> Hi swift-evolution,
>>> 
>>> Over the course of the past few weeks,

Re: [swift-evolution] [Update] Updates to SE-0166 and SE-0167

2017-06-26 Thread Itai Ferber via swift-evolution
Reply-all this time too. :)
Thanks for the feedback, David!

encodeIfPresent and decodeIfPresent are not strictly necessary, but they’re 
useful for further cutting down boilerplate. encodeIfPresent is equivalent to

if let value = value {
try container.encode(value, forKey: .someKey)
}
and decodeIfPresent is equivalent to

if container.contains(.someKey) {
value = try container.decode(Value.self, forKey: .someKey)
} else {
value = nil
}
They’re not big, but when you have a long list of optional properties, it’s 
much easier to read and comprehend than staring at a wall of Optional 
wrapping/unwrapping:

init(from decoder: Decoder) throws {
let container = try decoder.container(keyedBy: CodingKeys.self)

if container.contains(.prop1) {
prop1 = try container.decode(Prop1Type.self, forKey: .prop1)
} else {
prop1 = nil
}

if container.contains(.prop2) {
prop2 = try container.decode(Prop2Type.self, forKey: .prop2)
} else {
prop2 = nil
}

if container.contains(.prop3) {
prop3 = try container.decode(Prop3Type.self, forKey: .prop3)
} else {
prop3 = nil
}
}

// vs.

init(from decoder: Decoder) throws {
let container = try decoder.container(keyedBy: CodingKeys.self)
prop1 = try container.decodeIfPresent(Prop1Type.self, forKey: .prop1)
prop2 = try container.decodeIfPresent(Prop2Type.self, forKey: .prop2)
prop3 = try container.decodeIfPresent(Prop3Type.self, forKey: .prop3)
}
On 23 Jun 2017, at 13:52, David Hart wrote:

There are a lot of great changes here which make sense after the fact. I'll try 
to play around with them.

One thing I'm concerned about: with the new Optional conformance, why do we 
still need decodeIfPresent and encodeIfPresent? They seem superfluous now, and 
potentially confusing. Should users call encodeIfPresent/decodeIfPresent or 
encode/decode with an optional type? Do they have the same semantics?

On 23 Jun 2017, at 21:47, Itai Ferber via swift-evolution <swift-evolution@swift.org> wrote:

> Hi swift-evolution,
> 
> Over the course of the past few weeks, we’ve been gathering feedback about 
> the outcome of SE-0166 
> <https://github.com/apple/swift-evolution/blob/master/proposals/0166-swift-archival-serialization.md>
>  and SE-0167 
> <https://github.com/apple/swift-evolution/blob/master/proposals/0167-swift-encoders.md>
>  (both internally and externally), and we gathered a collection of updates 
> that we’re going to introduce to the proposals and to the implementation.
> 
> Attached is rendered HTML (I don’t want to make your mail clients unusable 
> like last time!) that lays out what we’d like to do. We’re not looking to do 
> a full review of these changes, but if you have feedback or questions, we’re 
> happy to get responses here.
> 
> Please note that some of these features have already been implemented (the 
> new error types, some of the optionality changes, collection conformances, 
> etc.), but we are receptive to comments on all of it. The existing proposals 
> will also be updated to incorporate these updates.
> 
> Thanks for all of your feedback!
> 
> — Itai
> 
> 
> ___
> swift-evolution mailing list
> swift-evolution@swift.org <mailto:swift-evolution@swift.org>
> https://lists.swift.org/mailman/listinfo/swift-evolution 
> <https://lists.swift.org/mailman/listinfo/swift-evolution>

> On Jun 24, 2017, at 1:29 AM, David Hart  wrote:
> 
> Sending out again to the whole mailing list ;-)
> 
> There are a lot of great changes here which make sense after the fact. I'll 
> try to play around with them.
> 
> One thing I'm concerned about: with the new Optional conformance, why do we 
> still need decodeIfPresent and encodeIfPresent? They seem superfluous now, 
> and potentially confusing. Should users call encodeIfPresent/decodeIfPresent 
> or encode/decode with an optional type? Do they have the same semantics?
> 
> On 23 Jun 2017, at 21:47, Itai Ferber via swift-evolution <swift-evolution@swift.org> wrote:
> 
>> Hi swift-evolution,
>> 
>> Over the course of the past few weeks, we’ve been gathering feedback about 
>> the outcome of SE-0166 
>> <https://github.com/apple/swift-evolution/blob/master/proposals/0166-swift-archival-serialization.md>
>>  and SE-0167 
>> <https://github.com/apple/swift-evolution/blob/master/proposals/0167-swift-encoders.md>
>>  (both internally and externally), and we gathered a collection of updates 
>> that we’re going to introduce to the proposals and to the implementation.
>> 
>> Attached is rendered HTML (I don’t want to make your mail clients unusable 
>> like last time!) that lays out

Re: [swift-evolution] Followup after in-the-field feedback for SE-0170

2017-06-16 Thread Itai Ferber via swift-evolution
Those cases will produce `nil`. "Should" is a matter of expectation, but 
I think that it’s reasonable behavior, since it feels truer to the 
intent of the `exactly:` methods.
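Concretely, the "exactly" semantics discussed here look like this (a sketch using Foundation's `NSNumber` bridging initializers):

```swift
import Foundation

let n = NSNumber(value: 0.1) // stored as a Double

// Exact conversion fails: the Double 0.1 has no exact Float representation.
let exact = Float(exactly: n)      // nil

// The lenient initializer rounds instead of failing.
let approx = Float(truncating: n)  // ≈ 0.1

// Jordan's case: a Double beyond Float's finite range is also not exactly
// representable, so the exact conversion produces nil rather than .infinity.
let huge = Float(exactly: NSNumber(value: Double.greatestFiniteMagnitude)) // nil
```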


On 16 Jun 2017, at 16:23, Jordan Rose via swift-evolution wrote:

> On Jun 14, 2017, at 13:59, Philippe Hausler via swift-evolution 
 wrote:




On Jun 14, 2017, at 1:47 PM, Dave Abrahams via swift-evolution <swift-evolution@swift.org> wrote:

on Wed Jun 14 2017, Philippe Hausler wrote:



After implementing the proposal
https://github.com/apple/swift-evolution/blob/master/proposals/0170-nsnumber_bridge.md 


we have gotten some initial feedback.

It seems that there is definitely a fair amount of confusion and 
heartache for dealing with Float

and Double values being bridged.

Specifically the cases like:

NSNumber(value: 0.1) as? Float

which with the current implementation will return nil since the 
Double value 0.1 cannot be

represented exactly as a Float.

It seems like the overwhelming majority of users know full well 
that

Float will result in a loss of precision (hence why they chose that
type over Double). This means that the floating point bridges for
Double, Float, and CGFloat should not be the pedantic “exactly” 
cases
but instead infer the conversion intent of a lax/approximated 
version.


+1


So in short: for the betterment of the API ergonomics, the floating
point types as destinations will be pulled back to their initial 
Swift

3 behavior.

Additionally to aide appropriate migration to the appropriate
truncating/exactly initializers the un-labeled, deprecated in Swift 
4,
plain init methods to the numeric types with an NSNumber will now 
be

annotated with the suggested replacements.


Sorry, I don't quite understand what that last paragraph implies.  
Could
you describe what will be deprecated and what will be suggested 
instead?


https://github.com/phausler/swift/commit/62218c85b6c494c4054ec9774dd6ce095a5d0fa4 



So this is just an annotation of renaming to the init(truncating:)


How about cases where the value is a double that's greater than 
Float.greatestFiniteMagnitude? Should/Will that produce 'nil' or 
Float.infinity?


Jordan




___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Possible issue with SE-0166 Swift Archival & Serialization implementation

2017-06-08 Thread Itai Ferber via swift-evolution
Sorry, meant for that to be a reply-all.

> On Jun 8, 2017, at 9:45 AM, Itai Ferber  wrote:
> 
> Hi Gwendal,
> 
>> On Jun 8, 2017, at 8:27 AM, Gwendal Roué via swift-evolution <swift-evolution@swift.org> wrote:
>> 
>>> 
>>> On 8 June 2017, at 16:51, James Froggatt via swift-evolution <swift-evolution@swift.org> wrote:
>>> 
>>> I've just been trying out the new Coding protocol, and was rather surprised 
>>> when trying to implement the `encode(to encoder: Encoder)` method.
>>> 
>>> The Swift evolution proposal provides the following example code:
>>> 
>>>   public func encode(to encoder: Encoder) throws {
>>>   // Generic keyed encoder gives type-safe key access: cannot encode 
>>> with keys of the wrong type.
>>>   let container = encoder.container(keyedBy: CodingKeys.self)
>>> 
>>>   // The encoder is generic on the key -- free key autocompletion here.
>>>   try container.encode(latitude, forKey: .latitude)
>>>   try container.encode(longitude, forKey: .longitude)
>>>   }
>>> 
>>> 
>>> Here, container is stored as a `let` value, and uses reference semantics, 
>>> while the proposal also clearly lists these `encode` methods as mutating. 
>>> With the current implementation of the proposal, the container must be 
>>> stored as a `var`, which leads to code like the following:
>>> 
>>>   var container = encoder.singleValueContainer()
>>>   try container.encode(data)
>> 
>> Yes, practically speaking and with latest Swift 4, the container needs to be 
>> declared as `var`.
>> 
>> I admit it's weird, and feels unnatural:
>> 
>>   public func encode(to encoder: Encoder) throws {
>>   // A mutated value that nobody consumes: so weird.
>>   var container = encoder.container(keyedBy: CodingKeys.self)
>>   try container.encode(latitude, forKey: .latitude)
>>   try container.encode(longitude, forKey: .longitude)
>>   }
> Why? It’s perfectly reasonable for the container to maintain some internal 
> state as it’s encoding. It shouldn’t have to sacrifice value semantics for 
> that.
> 
>>> This clearly won't work as expected if the container were to have value 
>>> semantics, and writing code like this feels plain wrong. Is SE-0166 really 
>>> intended to work with reference-type encoders only?
>> 
>> Actually, it can work with encoder/containers that have value semantics, and 
>> forward the mutations somewhere else (for example to a closure which fills a 
>> mutating container).
>> 
>> But this is again bizarre, and contrived: 
>> https://github.com/groue/GRDB.swift/blob/15bfe5f6cf76070cfb17216223bdebc6b158d654/GRDB/Record/Persistable%2BEncodable.swift
>>  
>> 
>> 
>> You make me think that those structs should swiftly be refactored into 
>> reference types.
>> 
>> Gwendal
>> 
>> ___
>> swift-evolution mailing list
>> swift-evolution@swift.org 
>> https://lists.swift.org/mailman/listinfo/swift-evolution 
>> 
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Possible issue with SE-0166 Swift Archival & Serialization implementation

2017-06-08 Thread Itai Ferber via swift-evolution

Hi James,

Good catch. This is a holdover from an older version of the proposal, 
and is now a typo.
It should be `var container` — the container can be a `struct`, and 
shouldn’t require reference semantics.
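With that fix, the proposal's example reads as follows (the same snippet quoted below, with only the declaration changed):

```swift
public func encode(to encoder: Encoder) throws {
    // The container is a struct that accumulates encoding state,
    // so it must be declared mutable:
    var container = encoder.container(keyedBy: CodingKeys.self)
    try container.encode(latitude, forKey: .latitude)
    try container.encode(longitude, forKey: .longitude)
}
```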


— Itai

On 8 Jun 2017, at 7:51, James Froggatt via swift-evolution wrote:

I've just been trying out the new Coding protocol, and was rather 
surprised when trying to implement the `encode(to encoder: Encoder)` 
method.


The Swift evolution proposal provides the following example code:

public func encode(to encoder: Encoder) throws {
// Generic keyed encoder gives type-safe key access: cannot 
encode with keys of the wrong type.

let container = encoder.container(keyedBy: CodingKeys.self)

// The encoder is generic on the key -- free key 
autocompletion here.

try container.encode(latitude, forKey: .latitude)
try container.encode(longitude, forKey: .longitude)
}


Here, container is stored as a `let` value, and uses reference 
semantics, while the proposal also clearly lists these `encode` 
methods as mutating. With the current implementation of the proposal, 
the container must be stored as a `var`, which leads to code like the 
following:


var container = encoder.singleValueContainer()
try container.encode(data)

This clearly won't work as expected if the container were to have value 
semantics, and writing code like this feels plain wrong. Is SE-0166 
really intended to work with reference-type encoders only?

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Question regarding SE-0167 Swift Encoders

2017-05-31 Thread Itai Ferber via swift-evolution
Hi Gwendal,

I hear your frustration. Some comments inline.

> On May 31, 2017, at 5:36 AM, Gwendal Roué  wrote:
> 
> Itai,
> 
> (This email is not technical)
> 
> I'm not claiming that SE-0166 should be able to address all archival formats. 
> I've been talking about GRDB to show at least one format that SE-0166 doesn't 
> cover well. And should SE-0166 be fixed to support SQL (in the GRDB fashion), 
> this does not mean that other developers won't eventually fight with SE-0166 
> until they understand it does not fit their bill.
I’ll respond to the technical portion of this thread in the other email, but 
let me at least provide some background here. When working on this feature, we 
thought for a very long time about what we were looking to support with this 
feature, and how (feel free to take a look at the Alternatives Considered 
section of the proposal, though of course, there were more attempts and 
approaches before that).
The majority of this thought was put into figuring out what the proper 
abstractions were for applying this new API — how can we abstract over 
different archival and serialization formats in a way that makes this useful?

In truth, if you try to abstract over all archival and serialization formats, 
the abstraction that you get is... the empty set. :) There are simply so many 
different things at odds with one another across different formats (JSON 
supports null values, plist does not; numbers are arbitrary precision in JSON, 
but not in plist or MessagePack or others; plist and MessagePack and others 
support binary data blobs, but JSON does not; etc.) that if you try to abstract 
over them all, you end up with nothing useful — an empty protocol that covers 
nothing.

So the key here is to try to strike a pragmatic balance between supporting some 
of the most common archival and serialization formats in a way that makes them 
useful, even if we have to handle special cases in some of them (e.g. null 
values in plist, binary data in JSON, etc.). It’s true that we cannot support 
them all, but in fact, we’re not looking to, because it would weaken the API.

I will respond to the comments specific to GRDB in the other thread, but this 
is a bit of background. Yes, there will always developers who will not be able 
to fit a serialization format into this API because it is fundamentally 
different in a way that cannot fit with the rest of the formats we’re looking 
to support. There’s nothing to be done about that. But, you mention this 
yourself.

> But there's something very special with SE-0166:
> 
> It's in the standard library, with all the backward-compatibility constraints 
> that come with such a position.
> 
> IT'S BLESSED WITH CODE GENERATION.
> 
> I don't know if you, Michael LeHew, Tony Parker, and the core team, realize 
> the importance of this insanely great privilege granted to this proposal.
Believe me, I do, because we considered a lot of different approaches before 
settling on this. We wanted to avoid code generation for this reason — it has a 
privileged place within the compiler, it generates code which the user may not 
be able to introspect, etc.
At the end of the day, though, we decided on this option because it provided 
the best user experience as part of the language in the vast majority of cases. 
There’s a lot to be said for that, and you mention this yourself, too.

> The lack of introspection and macros in Swift makes SE-0166 immensely 
> attractive for a whole category of libraries.
> 
> When SE-0166 is lacking, should those libs ignore it, and lose CODE 
> GENERATION, which means looking like it's still Swift 3?
> 
> Should those libs claim SE-0166 conformance, and raise runtime errors for 
> invalid inputs (where "invalid" does not mean "invalid data", or "invalid 
> code", but "impossible to fit in SE-0166" <=> "invalid library")?
That being said, let’s separate the capabilities of the Codable API itself from 
the code generated by the compiler for it. While the code generation is a huge 
convenience for the majority of simple cases, it does just that — generate code 
for the simple cases. We cannot arbitrarily generate code to match arbitrary 
applications. Much more is possible with custom encode/decode implementations 
and custom CodingKeys than you might imagine, rather than just sticking to the 
default, compiler-generated implementation. (Data migration, format-specific 
encoded representations, multiple sets of CodingKeys, etc.)

If a library finds use for the Codable APIs only for the code generation, then 
I think that’s likely a misapplication of the API. Attempting to use the Codable 
API to fit a square peg into a round hole will be frustrating because, well, it 
was designed for a singular purpose.
The code generation that comes with Codable is meant for archival and 
serialization, not for arbitrary introspection. You’re right in that there is 
an overlap here (and I think the key pain point is that we need better tools 
for doing in

Re: [swift-evolution] Question regarding SE-0167 Swift Encoders

2017-05-30 Thread Itai Ferber via swift-evolution

Hi Gwendal,

There are no stupid questions — everything helps hammer out this API, 
so I appreciate you taking the time to look at this so deeply.
I have to confess that I’m not familiar with this concept, but let’s 
take a look:


```swift
if let valueType = T.self as? DatabaseValueConvertible.Type {
    // if the column is missing, trigger the "missing key" error or return nil.
} else if let complexType = T.self as? RowConvertible.Type {
    // if the row scope is missing, trigger the "missing key" error or return nil.
} else {
    // don't know what to do
    fatalError("unsupported")
}
```

Is it appropriate for a type which is neither `DatabaseValueConvertible` 
nor `RowConvertible` to be decoded with your decoder? If not, then this 
warrants a `preconditionFailure` or an error of some sort, right? In 
this case, that would be valid.


You also mention that "it’s still impossible to support other Codable 
types" — what do you mean by this? Perhaps there’s a way to 
accomplish what you’re looking to do.
In any case, one option (which is not recommended unless if there are 
other avenues to solve this by) is to perform a "dry run" decoding. 
Attempt to decode the type with a dummy decoder to see what container it 
will need, then prepare your approach and do it again for real. 
Obviously, this isn’t a clean way to do it if we can find 
alternatives, but it’s an option.
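A minimal sketch of what such a "dry run" could look like: a hypothetical probe decoder that records which container kind a `Decodable` type requests, then aborts by throwing (all names here are invented for illustration):

```swift
// Which kind of container the probed type asked for.
enum ContainerKind { case keyed, unkeyed, singleValue }

struct ProbeAbort : Error {}

final class ProbeDecoder : Decoder {
    var codingPath: [CodingKey] = []
    var userInfo: [CodingUserInfoKey : Any] = [:]
    var requestedKind: ContainerKind?

    func container<Key>(keyedBy type: Key.Type) throws -> KeyedDecodingContainer<Key> where Key : CodingKey {
        requestedKind = .keyed
        throw ProbeAbort() // abort before having to vend a real container
    }

    func unkeyedContainer() throws -> UnkeyedDecodingContainer {
        requestedKind = .unkeyed
        throw ProbeAbort()
    }

    func singleValueContainer() throws -> SingleValueDecodingContainer {
        requestedKind = .singleValue
        throw ProbeAbort()
    }
}

// Usage sketch: probe a Decodable type T, then branch on probe.requestedKind.
// let probe = ProbeDecoder()
// _ = try? T(from: probe)
// switch probe.requestedKind { ... }
```

This only observes the type's first container request, so it remains the rough, last-resort option described above.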


— Itai

On 29 May 2017, at 4:51, Gwendal Roué via swift-evolution wrote:


Hello,

I have already asked stupid questions about SE-0167 and SE-0166, but 
this time I hope this is a real one.


According to SE-0166, codable types themselves instantiate a single 
value decoder, or a keyed container:


public struct Farm : Codable {
    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        ...
    }
}

public enum Animal : Int, Codable {
    public init(from decoder: Decoder) throws {
        let intValue = try decoder.singleValueContainer().decode(Int.self)
        ...
    }
}

According to SE-0167, decoders decode non-trivial types in their 
decode(_:forKey:) and decodeIfPresent(_:forKey:) methods:


	func decode<T>(_ type: T.Type, forKey key: Key) throws -> T where T : Decodable
	func decodeIfPresent<T>(_ type: T.Type, forKey key: Key) throws -> T? where T : Decodable


My trouble is that the decoder does not know whether the Decodable 
type will ask for a keyed container, or for a single value container.


Why is it a problem?

In the context of decoding of SQL rows, keys may refer to different 
things, depending on whether we are decoding a *value*, or a *complex 
object*:


- for values, keys are column names, as everybody can expect
- for complex objects, keys are names of "row scopes". Row scopes are 
a concept introduced by GRDB.swift and allows a type that knows how to 
consume `SELECT * FROM table1` to consume as well the results of 
`SELECT table1.*, table2.* FROM table1 JOIN table2` through a "scope" 
that presents the row in the shape expected by the consumer (here, 
only columns from table1).


This is supposed to allow support for types that contain both nested 
types and values (one of the goals of SE-0166 and SE-0167):


struct Compound : Codable {
    let someStruct: SomeStruct // object that feeds on the "someStruct" scope
    let name: String // value that feeds on the "name" column
}

The two decoding methods decode(_:forKey:) and 
decodeIfPresent(_:forKey:) can't be implemented nicely, because they 
don't know whether the decodable type will ask for a keyed container 
or a single value container, and thus they don't know whether they 
should look for the presence of a row scope, or of a column:


A workaround is to perform runtime checks on the GRDB protocols 
adopted by T, as below. But it's still impossible to support other 
codable types:


if let valueType = T.self as? DatabaseValueConvertible.Type {
    // if the column is missing, trigger the "missing key" error or return nil.
} else if let complexType = T.self as? RowConvertible.Type {
    // if the row scope is missing, trigger the "missing key" error or return nil.
} else {
    // don't know what to do
    fatalError("unsupported")
}

Do you have any advice?

Gwendal Roué


___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Feedback on SE-0166 and SE-0167

2017-05-27 Thread Itai Ferber via swift-evolution
Excellent! Glad that ended up working out — looking forward to seeing 
this improve quality of life for GRDB users and others. :)


On 27 May 2017, at 3:40, Gwendal Roué wrote:


> On 27 May 2017, at 11:59, David Hart wrote:


I didn’t know that was possible either! Really cool. Even better:

 if let databaseValueType = T.self as? DatabaseValueConvertible.Type {
     let databaseValue: DatabaseValue = row.value(named: key.stringValue)
     return databaseValueType.fromDatabaseValue(databaseValue) as! T
 } else { … }



Yes, that's even more clear like that :-)

And for Encodable support, it works just great (not a single line of 
boilerplate code):


private enum Color: String, DatabaseValueConvertible, Encodable {
case red, green, blue
}

private struct EncodableStruct : Persistable, Encodable {
static let databaseTableName = "myTable"
let name: String
let color: Color?
}

try dbQueue.inDatabase { db in
try EncodableStruct(name: "Arthur", color: .red).insert(db)
}

Gwendal



___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] Feedback on SE-0166 and SE-0167

2017-05-26 Thread Itai Ferber via swift-evolution
Hi Gwendal,

First, thanks for taking the time to adopt some of our new work. Feedback is 
always good, and we appreciate it.
Some comments inline.

> On May 26, 2017, at 7:26 AM, Gwendal Roué via swift-evolution 
>  wrote:
> 
> Hello,
> 
> I want to provide real-life feedback for the Swift Archival & Serialization 
> (SE-0166) and Swift Encoders (SE-0167) proposals that currently ship in Swift 
> 4 snapshots.
> 
> The context: I'm the author of GRDB.swift [1], a SQLite library that, among 
> other goals, aims at easing the conversion between database rows and custom 
> models (structs and class hierarchies):
> 
>   // Sample code
>   let arthur = Player(name: "Arthur", score: 100)
>   try arthur.insert(db)
>   print(arthur.id)
>   
>   let topPlayers = try Player
>   .order(Column("score").desc)
>   .limit(10)
>   .fetchAll(db) // [Player]
> 
> Due to the lack of any introspection in Swift, GRDB currently wants you to 
> perform explicit conversion:
> 
>   struct Player {
>   var id: Int64?
>   let name: String
>   let score: Int
>   }
>   
>   extension Player : RowConvertible {
>   init(row: Row) {
>   id = row.value(named: "id")
>   name = row.value(named: "name")
>   score = row.value(named: "score")
>   }
>   }
>   
>   extension Player : TableMapping, MutablePersistable {
>   static let databaseTableName = "player"
>   var persistentDictionary: [String: DatabaseValueConvertible?] {
>   return ["id": id, "name": name, "score": score]
>   }
>   }
> 
> That's enough, but that's still too much.
> 
> SE-0166 and SE-0167 sound like the promise that some boilerplate code could 
> be automatically generated.
> 
> Along with JSONDecoder and PListDecoder, let's introduce DatabaseRowDecoder! 
> The current state of the work is at 
> https://github.com/groue/GRDB.swift/tree/Swift4
> 
> 
> At first, it's very satisfying. Decodable keeps some of its promises:
> 
>   struct Player : RowConvertible, Decodable {
>   static let databaseTableName = "player"
>   var id: Int64?
>   let name: String
>   let score: Int
>   }
>   
>   // Yeah, no more extra code necessary for this to work!
>   let topPlayers = try Player
>   .order(Column("score").desc)
>   .limit(10)
>   .fetchAll(db)
> 
> But there are some issues.
> 
> 
> ### Issue 1: SE-0166/0167 merge the concepts of keyed objects and values
> 
> This is a problem. Let's take this example:
> 
>   enum Color: Int, Codable {
>   case blue, green, red
>   }
>   
>   struct Flower : RowConvertible, Decodable {
>   let name: String
>   let color: Color
>   }
>   
> The way to decode a color comes from KeyedDecodingContainerProtocol:
> 
>   protocol KeyedDecodingContainerProtocol {
>   func decode<T>(_ type: T.Type, forKey key: Key) throws -> T 
> where T : Decodable
>   func decodeIfPresent<T>(_ type: T.Type, forKey key: Key) throws 
> -> T? where T : Decodable
>   }
> 
> But the ability to decode a Color from a database row comes from the 
> DatabaseValueConvertible protocol, which I can't invoke since I can't test 
> whether type T conforms to it:
> 
>   struct RowKeyedDecodingContainer<Key: CodingKey>: 
> KeyedDecodingContainerProtocol {
>   let row: Row
>   
>   // Not OK: no support for values
>   func decode<T>(_ type: T.Type, forKey key: Key) throws -> T 
> where T : Decodable {
>   if <T conforms to DatabaseValueConvertible> {
>   let databaseValue: DatabaseValue = 
> row.value(named: key.stringValue)
>   return T.fromDatabaseValue(databaseValue) 
>   } else { ... }
>   }
>   }
What is preventing you from doing this check? I looked at the GRDB code, but 
AFAICT, DatabaseValueConvertible has no associated types, nor does 
SQLExpression, which it adopts.

You should be able to express this code as:

if T.self is DatabaseValueConvertible.Type {
let databaseValue: DatabaseValue = row.value(named: key.stringValue)
return (T.self as! 
DatabaseValueConvertible.Type).fromDatabaseValue(databaseValue) as! T
} else { … }
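To make that suggestion concrete, here is a minimal, self-contained sketch of the runtime conformance check. The protocol requirement and the `Color` type below are illustrative stand-ins, not GRDB's actual API:

```swift
// Illustrative stand-in for GRDB's protocol (not its real shape):
// an initializer requirement lets us call through an existential metatype.
protocol DatabaseValueConvertible {
    init(databaseValue: String)
}

struct Color: DatabaseValueConvertible {
    let raw: String
    init(databaseValue: String) { self.raw = databaseValue }
}

// Generic function that branches on whether T conforms to the protocol
// at runtime, exactly as the `T.self is P.Type` check above does.
func decodeValue<T>(_ type: T.Type, from raw: String) -> T? {
    if let convertible = T.self as? DatabaseValueConvertible.Type {
        return convertible.init(databaseValue: raw) as? T
    }
    return nil  // T does not conform; a real decoder would fall through
}
```

Calling `decodeValue(Color.self, from: "red")` succeeds via the conformance branch, while `decodeValue(Int.self, from: "7")` returns nil because `Int` does not conform here.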

> So the current state of the Codable library disallows GRDB from supporting 
> value properties which are not the trivial Int, Int32, etc. Of course, GRDB 
> itself makes it possible, with explicit user code. But we're talking about 
> removing boilerplate and relying on the code generation that Codable is 
> blessed with, here. We're talking about sharing the immense privilege that 
> Codable is blessed with.
> 
> However, if I can't decode **values**, I can still decode **complex keyed 

Re: [swift-evolution] Pitch: Automatically deriving Equatable/Hashable for more value types

2017-05-15 Thread Itai Ferber via swift-evolution

> On May 15, 2017, at 4:03 PM, Xiaodi Wu via swift-evolution 
>  wrote:
> 
> This is nice. Thanks for taking the time to write it up. I do have some 
> concerns/questions:
> 
> Do the rules you spell out align with those for Codable? I think it is very 
> important that these are paralleled as closely as possible, and that any 
> deviations are explicitly called out in the text with reasoning as to why it 
> must deviate. Knowing when something is synthesized is difficult enough with 
> one set of rules--two is certainly one too many.
To spell out the rules of Codable conformance clearly, for reference:

> For example, is it permitted to extend a type in the same module in order to 
> obtain synthesized Codable conformance? How about for a type in a different 
> module? The same rules should then apply for Equatable and Hashable synthesis.
Yes, Codable conformance can be added in an extension both intra-module, and 
inter-module (i.e. you can add Codable conformance via extensions in your own 
module, or to types in other modules). If there is a situation where this is 
not possible, that’s likely a bug.
[For reference, it is actually easier to allow this than to prevent it. I had 
to do very little extra work to support this because of how this is organized 
in the compiler.]

> Furthermore, does Codable ignore computed properties? If not, then neither 
> should Equatable and Hashable.
Yes. Derived conformance for Codable ignores all computed properties (including 
lazy properties and their associated storage). This default behavior is also 
relatively easy to implement; you can iterate all properties matching this requirement 
via `NominalTypeDecl.getStoredProperties` 
(getStoredProperties(/*skipInaccessible=*/true) will skip the storage 
associated with lazy vars).
[The thought process here is that accessing computed vars (and more so lazy 
vars) will generally have side effects. We don’t want to trigger side effects 
on encoding/checking for equality/hashing, and in general, those types of 
properties will not affect equality/hash value/encoded representation.]
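As a concrete illustration of that rule, a computed property simply never appears in the derived encoded output (a minimal sketch; the `Rect` type is made up for illustration):

```swift
import Foundation

// Derived Codable conformance only encodes stored properties.
struct Rect: Codable {
    let width: Double
    let height: Double
    var area: Double { return width * height }  // computed: ignored by synthesis
}

let json = String(data: try! JSONEncoder().encode(Rect(width: 2, height: 3)),
                  encoding: .utf8)!
// The payload contains "width" and "height" keys, but no "area" key.
```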

> There are also some complicated rules with generics, if I recall, that may 
> force something to be a computed property. It would be worth exploring if 
> such rules make ignoring computed properties counterintuitive. For instance, 
> if a user has to redesign the type, changing a stored property to a computed 
> property just to satisfy certain rules of the language, and all of a sudden 
> the definition of equality has silently changed as a consequence, then that 
> could end up being very hard to debug. If we find that this is a plausible 
> issue, then it might be worth considering refusing to synthesize Equatable 
> conformance for a type with any computed properties--obviously limiting, but 
> better limiting than surprising. To be clear, I'm not suggesting that we do 
> make this limitation, just that I don't know that the consequences have been 
> adequately explored for not including computed properties.
I’m not sure about this — someone else will have to weigh in. I don’t think 
I’ve ever encountered a situation like this while working on Codable. That 
being said, if there’s a limiting factor here that we encounter, we should 
absolutely be consistent between all implementations of derived conformance. 

It would be helpful to document these rules somewhere, so noted.

> On Mon, May 15, 2017 at 17:21 Tony Allevato via swift-evolution 
> mailto:swift-evolution@swift.org>> wrote:
> Yes—the PR of the proposal is here: 
> https://github.com/apple/swift-evolution/pull/706 
> 
> 
> It needs to be updated slightly—I'll remove the references to the 
> "multiplicative hash function" recommendation because I ended up using the 
> existing _mixInt and xor, which is how the standard library implements its 
> Collection hashValues. (The proposal probably really doesn't need to state 
> anything about the hash function used, and it's entirely an implementation 
> detail.)
> 
> 
> On Mon, May 15, 2017 at 3:18 PM Andrew Bennett  > wrote:
> Nice work Tony! Is this proposal up for PR on swift-evolution as well?
> 
> On Tue, 16 May 2017 at 7:30 am, Tony Allevato  > wrote:
> Just to update everyone on the thread—it took a little longer than I'd hoped 
> to get the kinks out, but I finally have the implementation up as a PR: 
> https://github.com/apple/swift/pull/9619 
> 
> 
> Hopefully there's still enough time to get the proposal reviewed, make any 
> changes needed, and get this into Swift 4!
> 
> 
> On Tue, May 9, 2017 at 10:27 PM Brent Royal-Gordon  > wrote:
>> On May 9, 2017, at 3:53 PM, Tony Allevato via swift-evolution 
>> mailto:swift-evolution@swift.org>> wrote:
>> 
>> Likewise, proposing a new public addition to the standard library would 
>> inspire far more design discu

Re: [swift-evolution] [Accepted] SE-0167: Swift Encoders

2017-05-09 Thread Itai Ferber via swift-evolution
Hi Johannes,

This is an implementation detail that is subject to change, but JSONEncoder and 
JSONDecoder defer to JSONSerialization to provide the actual serialization to 
and from JSON.
Internally, numbers are represented as NSNumber instances until you ask for 
them on decode, so large integers are indeed non-lossy — you can round-trip the 
values in a 64-bit integer just fine.

If you ask to coerce them as Double values, they will coerce (and lose 
precision), but that is true today.
Of course, it is not possible to encode values out of Double range as Doubles, 
since you cannot construct such a Double value.

To wit, any number value encoded via JSONEncoder will always be round-trippable 
via JSONDecoder provided that you try to decode as the same type as you 
encoded; this we guarantee.
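A quick sketch of that guarantee, assuming Foundation's JSONEncoder/JSONDecoder as shipped in Swift 4 (the value is one of the integers from Johannes's example below, which lies outside the Double-exact range):

```swift
import Foundation

// A 64-bit integer outside [-(2^53)+1, (2^53)-1] is not exactly
// representable as a Double, but round-trips through JSON intact
// when decoded as the same type it was encoded as.
let original: Int64 = 9_851_624_185_071_827
let data = try! JSONEncoder().encode([original])  // top-level arrays are accepted
let decoded = try! JSONDecoder().decode([Int64].self, from: data)
// decoded == [original], with no loss of precision
```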

tl;dr: If you ask for this number as an Int64 or UInt64, you will get the full 
number without loss of precision.

— Itai

> On May 9, 2017, at 9:27 AM, Johannes Weiss via swift-evolution 
>  wrote:
> 
> Hi,
> 
> Sorry, I'm very late to the party but have one quick question that I think 
> should be resolved/documented before the patch is landed:
> 
> What do we do with integers outside the range [-(2**53)+1, (2**53)-1]? Those 
> are the integers that are precisely representable by doubles (IEEE 754-2008 
> binary64, double precision). And the problem is that at least 
> JavaScript treats all numbers as doubles, which leads to this problem:
> 
> [9851624185071827, 9851624185071829] as [Double] makes them
> [9851624185071828, 9851624185071828]
> 
> (so two different numbers get both mapped to a third number which sometimes 
> causes problems in the real world [4])
> 
> 
> The I(nternet)-JSON RFC [1] states that
> 
>   Implementations that generate I-JSON messages cannot assume that
>   receiving implementations can process numeric values with greater
>   magnitude or precision than provided by those numbers.
> 
> Now since Swift isn't JavaScript we fortunately don't store all numbers as 
> doubles so I'm sure a roundtrip of the number 9851624185071827 (which is 
> outside that range) will just work. Nevertheless the RFC [1] says
> 
>   For applications that require the exact interchange of numbers with
>   greater magnitude or precision, it is RECOMMENDED to encode them in
>   JSON string values.
> 
> I'm not sure if following that recommendation is a good idea but in any case 
> I think it would be worth documenting it. Other encoders sometimes allow you 
> to specify 'numbers as strings' as an option [2] or outright refuse to encode 
> it.
> 
> Twitter also covers the subject [3] for its API.
> 
> -- 
>  Johannes
> 
> [1]: https://tools.ietf.org/html/rfc7493#section-2.2
> [2]: 
> https://fasterxml.github.io/jackson-core/javadoc/2.4/com/fasterxml/jackson/core/JsonGenerator.Feature.html#WRITE_NUMBERS_AS_STRINGS
> [3]: https://dev.twitter.com/overview/api/twitter-ids-json-and-snowflake
> [4]: https://github.com/nodejs/node/issues/12115
> 
>> On 26 Apr 2017, at 12:10 am, Douglas Gregor via swift-evolution 
>>  wrote:
>> 
>> Proposal Link: 
>> https://github.com/apple/swift-evolution/blob/master/proposals/0167-swift-encoders.md
>> 
>> The review of SE-0167 "SE-0167: Swift Encoders” ran from April 6...12, 2017. 
>> The proposal is accepted. Thanks to everyone who participated in the review!
>> 
>>  - Doug
>>  Review Manager
>> 
>> 
>>  
>> ___
>> swift-evolution mailing list
>> swift-evolution@swift.org
>> https://lists.swift.org/mailman/listinfo/swift-evolution
> 
> ___
> swift-evolution mailing list
> swift-evolution@swift.org
> https://lists.swift.org/mailman/listinfo/swift-evolution

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Accepted] SE-0166: Swift Archival & Serialization

2017-05-01 Thread Itai Ferber via swift-evolution
If you need multiple different representations for multiple different 
formats, then yes, you will likely want to supply different `CodingKeys` 
enums for those formats and write a custom encode which switches on the 
format you're writing to. Or, you can use one `CodingKeys` enum which 
has multiple different key representations (for the different formats 
you want to support) and use that.


Automatic key renaming is an inherently unsafe operation, so it's not 
something that we want to provide out of the box, or encourage, but it 
should be possible if you really want it.
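A hedged sketch of the "custom encode which switches on the format" approach described above. The `Format` enum and the `userInfo` key used to communicate the format to the type are illustrative assumptions, not a built-in mechanism:

```swift
import Foundation

// Illustrative: the encoder's userInfo carries which format we're targeting.
enum Format { case json, plist }
let formatKey = CodingUserInfoKey(rawValue: "format")!

struct Item: Encodable {
    let itemName: String

    // One key set per format, as suggested above.
    private enum JSONKeys: String, CodingKey { case itemName = "item_name" }
    private enum PlistKeys: String, CodingKey { case itemName = "ItemName" }

    func encode(to encoder: Encoder) throws {
        if encoder.userInfo[formatKey] as? Format == .plist {
            var container = encoder.container(keyedBy: PlistKeys.self)
            try container.encode(itemName, forKey: .itemName)
        } else {
            var container = encoder.container(keyedBy: JSONKeys.self)
            try container.encode(itemName, forKey: .itemName)
        }
    }
}

let encoder = JSONEncoder()
encoder.userInfo[formatKey] = Format.json
let data = try! encoder.encode(Item(itemName: "Widget"))
// The JSON payload uses "item_name"; an encoder tagged .plist would use "ItemName".
```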


On 1 May 2017, at 13:04, Anders Ha wrote:

I do mean optional automatic translation on the encoders’ and 
decoders’ end though. Derived conformances from other common naming 
conventions are nice to have, but does not help if one wants to define 
a model that can be decoded from and encoded into different coded 
representations of itself. It also “hardwires” the model with a 
specific coding scheme, which does not seem very nice from an 
encapsulation PoV.


That said I am fairly sure there would be third party libraries 
providing key mapping/transformation, if Foundation’s encoders and 
decoders do not provide such functionality. So it is probably not a 
big deal.


Regards
Anders


On 2 May 2017, at 3:06 AM, Itai Ferber  wrote:

Sorry, one clarifying statement: it's not the JSON encoder and 
decoder that will be providing this renaming (since they should be 
encoding the key values they are given as-is); the key names are part 
of the definition of the enum, as declared as part of the type.


On 1 May 2017, at 12:04, Itai Ferber via swift-evolution wrote:

Yes, this should be true for most types.
The compiler derives conformance based on a nested CodingKeys type 
within your Codable type. If you do not supply one, it will derive 
one on your behalf, but if you do provide one, making a naming 
transition like this is trivial:


public struct Post : Codable {
let authorID: Int
let authorName: String
let bodyText: String

private enum CodingKeys: String, CodingKey {
case authorID = "author_id"
case authorName = "author_name"
case bodyText = "body_text"
}

// init(from:) and encode(to:) are still automatically generated
}

This is something we wanted to explicitly support as we don't want 
users to have to violate Swift naming guidelines, and is a step along 
the progressive disclosure of the API that we want to provide.
As long as the case names of the enum match 1-to-1 with property 
names, derived conformance still applies.


On 1 May 2017, at 11:39, Anders Ha wrote:


I thought it would be quite trivial to have the JSON encoder and 
decoder transforming the keys between camel case and snake case, 
wouldn't it?



Regards
Anders

On 2 May 2017, at 1:57 AM, Jon Shier via swift-evolution 
 wrote:


	FYI, I’d give the derived implementations a very low chance of 
ever being used for JSON. Unless developers violate the Swift naming 
guidelines for their properties at least or they don’t have 
properties with multiword keys.
	Once this functionality has landed for the Swift 4 branch, I plan 
to implement some of the tricky JSON types I had to decode on a 
recent project, just to see how painful the custom syntax will be, 
and to compare it to my existing Argo implementation. Hopefully 
there will be time for at least one round of feedback to be 
integrated into this functionality.




Jon


On May 1, 2017, at 12:54 PM, Itai Ferber via swift-evolution 
 wrote:


Hi Goffredo,

On Apr 26, 2017, at 2:00 PM, Goffredo Marocchi  
wrote:


Hello Itai,

Sorry for the confusion, but I understood that the following

To answer your second question, the reason is that using the 
protocol implies that all encoders and decoders must support 
anything that conforms to that protocol. We’re not sure this is 
a reasonable requirement. Many formats do not have any kind of 
support for arbitrary size integers, for example. Therefore, we 
felt it was best to limit it to a set of concrete types.


meant it would actually hinder that kind of transformation or make 
it more difficult to write custom decoders and encoders. Sorry if 
I misunderstood that.


One follow up question: what would happen if inside the JSON mock 
object you posted I were to remove the 'address' key (in terms of 
the produced object and how to access its inner properties)?


What would happen if I remove the 'name' one or better if I add 
another key to the JSON object?
Codable conformance is derived by default to require that all 
non-optional properties be initialized. This means that if you have 
a non-optional property address: Address but there is no address 
key in the JSON payload you're decoding from, it will throw an 
error to indicate that the key was not found.
On the flip side, if the JSON payload has information in it which 
your type does not have (e.g. if there 

Re: [swift-evolution] [Accepted] SE-0166: Swift Archival & Serialization

2017-05-01 Thread Itai Ferber via swift-evolution
Sorry, one clarifying statement:  it's not the JSON encoder and decoder 
that will be providing this renaming (since they should be encoding the 
key values they are given as-is); the key names are part of the 
definition of the enum, as declared as part of the type.


On 1 May 2017, at 12:04, Itai Ferber via swift-evolution wrote:


Yes, this should be true for most types.
The compiler derives conformance based on a nested `CodingKeys` type 
within your `Codable` type. If you do not supply one, it will derive 
one on your behalf, but if you do provide one, making a naming 
transition like this is trivial:


```swift
public struct Post : Codable {
let authorID: Int
let authorName: String
let bodyText: String

private enum CodingKeys: String, CodingKey {
case authorID = "author_id"
case authorName = "author_name"
case bodyText = "body_text"
}

// init(from:) and encode(to:) are still automatically generated
}
```
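For example, encoding a value of this type produces the renamed keys, not the Swift property names (the type is re-declared here so the sketch is self-contained; the sample values are made up):

```swift
import Foundation

public struct Post: Codable {
    let authorID: Int
    let authorName: String
    let bodyText: String

    private enum CodingKeys: String, CodingKey {
        case authorID = "author_id"
        case authorName = "author_name"
        case bodyText = "body_text"
    }
}

let post = Post(authorID: 1, authorName: "Jane Doe", bodyText: "Hello")
let json = String(data: try! JSONEncoder().encode(post), encoding: .utf8)!
// json contains "author_id", "author_name", "body_text" — not "authorID", etc.
```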

This is something we wanted to explicitly support as we don't want 
users to have to violate Swift naming guidelines, and is a step along 
the progressive disclosure of the API that we want to provide.
As long as the case names of the enum match 1-to-1 with property 
names, derived conformance still applies.


On 1 May 2017, at 11:39, Anders Ha wrote:

I thought it would be quite trivial to have the JSON encoder and 
decoder transforming the keys between camel case and snake case, 
wouldn't it?



Regards
Anders


On 2 May 2017, at 1:57 AM, Jon Shier via swift-evolution 
 wrote:


	FYI, I’d give the derived implementations a very low chance of 
ever being used for JSON. Unless developers violate the Swift naming 
guidelines for their properties at least or they don’t have 
properties with multiword keys.
	Once this functionality has landed for the Swift 4 branch, I plan 
to implement some of the tricky JSON types I had to decode on a 
recent project, just to see how painful the custom syntax will be, 
and to compare it to my existing Argo implementation. Hopefully 
there will be time for at least one round of feedback to be 
integrated into this functionality.




Jon


On May 1, 2017, at 12:54 PM, Itai Ferber via swift-evolution 
mailto:swift-evolution@swift.org>> 
wrote:


Hi Goffredo,

On Apr 26, 2017, at 2:00 PM, Goffredo Marocchi wrote:


Hello Itai,

Sorry for the confusion, but I understood that the following

To answer your second question, the reason is that using the 
protocol implies that all encoders and decoders must support 
anything that conforms to that protocol. We’re not sure this is 
a reasonable requirement. Many formats do not have any kind of 
support for arbitrary size integers, for example. Therefore, we 
felt it was best to limit it to a set of concrete types.


meant it would actually hinder that kind of transformation or make 
it more difficult to write custom decoders and encoders. Sorry if 
I misunderstood that.


One follow up question: what would happen if inside the JSON mock 
object you posted I were to remove the 'address' key (in terms of 
the produced object and how to access its inner properties)?


What would happen if I remove the 'name' one or better if I add 
another key to the JSON object?
Codable conformance is derived by default to require that all 
non-optional properties be initialized. This means that if you have 
a non-optional property address: Address but there is no address 
key in the JSON payload you're decoding from, it will throw an 
error to indicate that the key was not found.
On the flip side, if the JSON payload has information in it which 
your type does not have (e.g. if there is an address in the JSON, 
but your Person just has name), the extra data is ignored.


This, however, is just in the default, derived conformance. For 
more complex cases, you can always provide your own init(from:) and 
encode(to:)to do custom decoding. If you have a property which may 
or may not be in the JSON, you can always decodeIfPresent, which 
will return nil if the key or value was not found.
If you need to access sub-objects in the JSON data which do not map 
to your properties 1-to-1, e.g. your payload looks like {"name": 
"John Doe", "address": { "street": "1 Infinite Loop", ... } }, but 
your type looks like

struct Person {
let name: String
let street: String
let city: String
// ...
}
then you can always access the nested data by requesting a 
nestedContainer(keyedBy: ..., forKey: .address) which will return a 
container wrapping the address sub-object, which you can then pull 
fields out of.


The derived conformance case gives a reasonable default, but you 
can always write your own init(from:) and encode(to:) to handle 
custom needs.




Sent from my iPhone

On 26 Apr 2017, at 21:28, Itai Ferber <mailto:ifer...@apple.c

Re: [swift-evolution] [Accepted] SE-0166: Swift Archival & Serialization

2017-05-01 Thread Itai Ferber via swift-evolution

Yes, this should be true for most types.
The compiler derives conformance based on a nested `CodingKeys` type 
within your `Codable` type. If you do not supply one, it will derive one 
on your behalf, but if you do provide one, making a naming transition 
like this is trivial:


```swift
public struct Post : Codable {
let authorID: Int
let authorName: String
let bodyText: String

private enum CodingKeys: String, CodingKey {
case authorID = "author_id"
case authorName = "author_name"
case bodyText = "body_text"
}

// init(from:) and encode(to:) are still automatically generated
}
```

This is something we wanted to explicitly support as we don't want users 
to have to violate Swift naming guidelines, and is a step along the 
progressive disclosure of the API that we want to provide.
As long as the case names of the enum match 1-to-1 with property names, 
derived conformance still applies.


On 1 May 2017, at 11:39, Anders Ha wrote:

I thought it would be quite trivial to have the JSON encoder and 
decoder transforming the keys between camel case and snake case, 
wouldn't it?



Regards
Anders


On 2 May 2017, at 1:57 AM, Jon Shier via swift-evolution 
 wrote:


	FYI, I’d give the derived implementations a very low chance of 
ever being used for JSON. Unless developers violate the Swift naming 
guidelines for their properties at least or they don’t have 
properties with multiword keys.
	Once this functionality has landed for the Swift 4 branch, I plan to 
implement some of the tricky JSON types I had to decode on a recent 
project, just to see how painful the custom syntax will be, and to 
compare it to my existing Argo implementation. Hopefully there will 
be time for at least one round of feedback to be integrated into this 
functionality.




Jon


On May 1, 2017, at 12:54 PM, Itai Ferber via swift-evolution 
mailto:swift-evolution@swift.org>> 
wrote:


Hi Goffredo,

On Apr 26, 2017, at 2:00 PM, Goffredo Marocchi wrote:


Hello Itai,

Sorry for the confusion, but I understood that the following

To answer your second question, the reason is that using the 
protocol implies that all encoders and decoders must support 
anything that conforms to that protocol. We’re not sure this is 
a reasonable requirement. Many formats do not have any kind of 
support for arbitrary size integers, for example. Therefore, we 
felt it was best to limit it to a set of concrete types.


meant it would actually hinder that kind of transformation or make 
it more difficult to write custom decoders and encoders. Sorry if I 
misunderstood that.


One follow up question: what would happen if inside the JSON mock 
object you posted I were to remove the 'address' key (in terms of 
the produced object and how to access its inner properties)?


What would happen if I remove the 'name' one or better if I add 
another key to the JSON object?
Codable conformance is derived by default to require that all 
non-optional properties be initialized. This means that if you have 
a non-optional property address: Address but there is no address key 
in the JSON payload you're decoding from, it will throw an error to 
indicate that the key was not found.
On the flip side, if the JSON payload has information in it which 
your type does not have (e.g. if there is an address in the JSON, 
but your Person just has name), the extra data is ignored.


This, however, is just in the default, derived conformance. For more 
complex cases, you can always provide your own init(from:) and 
encode(to:)to do custom decoding. If you have a property which may 
or may not be in the JSON, you can always decodeIfPresent, which 
will return nil if the key or value was not found.
If you need to access sub-objects in the JSON data which do not map 
to your properties 1-to-1, e.g. your payload looks like {"name": 
"John Doe", "address": { "street": "1 Infinite Loop", ... } }, but 
your type looks like

struct Person {
let name: String
let street: String
let city: String
// ...
}
then you can always access the nested data by requesting a 
nestedContainer(keyedBy: ..., forKey: .address) which will return a 
container wrapping the address sub-object, which you can then pull 
fields out of.


The derived conformance case gives a reasonable default, but you can 
always write your own init(from:) and encode(to:) to handle custom 
needs.




Sent from my iPhone

On 26 Apr 2017, at 21:28, Itai Ferber wrote:



Hi Goffredo,

Unless I'm misunderstanding what you mean here, this is exactly 
what we're proposing with the API — anything Encodable can 
encode any type that is Encodable as a nested value:


struct Person : Codable {
let name: String
let address: Address
}

struct Address : Codable {
let str

Re: [swift-evolution] [Accepted] SE-0166: Swift Archival & Serialization

2017-05-01 Thread Itai Ferber via swift-evolution
Hi Goffredo,

> On Apr 26, 2017, at 2:00 PM, Goffredo Marocchi  wrote:
> 
> Hello Itai,
> 
> Sorry for the confusion, but I understood that the following
> 
>> To answer your second question, the reason is that using the protocol 
>> implies that all encoders and decoders must support anything that conforms 
>> to that protocol. We’re not sure this is a reasonable requirement. Many 
>> formats do not have any kind of support for arbitrary size integers, for 
>> example. Therefore, we felt it was best to limit it to a set of concrete 
>> types.
> 
> meant it would actually hinder that kind of transformation or make it more 
> difficult to write custom decoders and encoders. Sorry if I misunderstood 
> that.
> 
> One follow up question: what would happen if inside the JSON mock object you 
> posted I were to remove the 'address' key (in terms of the produced object 
> and how to access its inner properties)? 
> 
> What would happen if I remove the 'name' one or better if I add another key 
> to the JSON object?
Codable conformance is derived by default to require that all non-optional 
properties be initialized. This means that if you have a non-optional property 
address: Address but there is no address key in the JSON payload you're 
decoding from, it will throw an error to indicate that the key was not found.
On the flip side, if the JSON payload has information in it which your type 
does not have (e.g. if there is an address in the JSON, but your Person just 
has name), the extra data is ignored.

This, however, is just in the default, derived conformance. For more complex 
cases, you can always provide your own init(from:) and encode(to:) to do custom 
decoding. If you have a property which may or may not be in the JSON, you can 
always decodeIfPresent, which will return nil if the key or value was not found.
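A minimal sketch of decodeIfPresent in a hand-written init(from:); the `Player` type and field names are illustrative:

```swift
import Foundation

struct Player: Decodable {
    let name: String
    let nickname: String?  // may be absent from the payload

    private enum CodingKeys: String, CodingKey { case name, nickname }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        name = try container.decode(String.self, forKey: .name)
        // Returns nil instead of throwing when the key or value is missing.
        nickname = try container.decodeIfPresent(String.self, forKey: .nickname)
    }
}

let payload = Data(#"{"name": "Arthur"}"#.utf8)
let player = try! JSONDecoder().decode(Player.self, from: payload)
// player.nickname == nil — no error thrown for the missing key
```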
If you need to access sub-objects in the JSON data which do not map to your 
properties 1-to-1, e.g. your payload looks like {"name": "John Doe", "address": 
{ "street": "1 Infinite Loop", ... } }, but your type looks like
struct Person {
let name: String
let street: String
let city: String
// ...
}
then you can always access the nested data by requesting a 
nestedContainer(keyedBy: ..., forKey: .address) which will return a container 
wrapping the address sub-object, which you can then pull fields out of.

The derived conformance case gives a reasonable default, but you can always 
write your own init(from:) and encode(to:) to handle custom needs.
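For completeness, a sketch of that nestedContainer(keyedBy:forKey:) approach for the flattened Person described above (the city field stands in for the elided "..." properties):

```swift
import Foundation

// Flat Swift type decoded from a payload with a nested "address" object.
struct Person: Decodable {
    let name: String
    let street: String
    let city: String

    private enum CodingKeys: String, CodingKey { case name, address }
    private enum AddressKeys: String, CodingKey { case street, city }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        name = try container.decode(String.self, forKey: .name)
        // Dive into the nested address sub-object and pull fields out of it.
        let address = try container.nestedContainer(keyedBy: AddressKeys.self,
                                                    forKey: .address)
        street = try address.decode(String.self, forKey: .street)
        city = try address.decode(String.self, forKey: .city)
    }
}

let json = #"{"name": "John Doe", "address": {"street": "1 Infinite Loop", "city": "Cupertino"}}"#
let person = try! JSONDecoder().decode(Person.self, from: Data(json.utf8))
```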

> 
> Sent from my iPhone
> 
> On 26 Apr 2017, at 21:28, Itai Ferber  > wrote:
> 
>> Hi Goffredo,
>> 
>> Unless I'm misunderstanding what you mean here, this is exactly what we're 
>> proposing with the API — anything Encodable can encode any type that is 
>> Encodable as a nested value:
>> 
>> struct Person : Codable {
>> let name: String
>> let address: Address
>> }
>> 
>> struct Address : Codable {
>> let street: String
>> let city: String
>> let state: String
>> let zipCode: Int
>> let country: String
>> }
>> 
>> let address = Address(street: "1 Infinite Loop", city: "Cupertino", state: 
>> "CA", zipCode: 95014, country: "United States")
>> let person = Person(name: "John Doe", address: address)
>> 
>> let encoder = JSONEncoder()
>> let payload = try encoder.encode(person)
>> print(String(data: payload, encoding: .utf8)!) // => {"name": "John Doe", 
>> "address": {"street": "1 Infinite Loop", ... } }
>> 
>> let decoder = JSONDecoder()
>> let decoded = try decoder.decode(Person.self, from: payload) // => 
>> Person(name: "John Doe", address: ...)
>> Or have I misunderstood you?
>> 
>> — Itai
>> 
>> On 26 Apr 2017, at 13:11, Goffredo Marocchi via swift-evolution wrote:
>> 
>> 
>> 
>> 
>> Sent from my iPhone
>> 
>> On 26 Apr 2017, at 17:24, Tony Parker via swift-evolution 
>> mailto:swift-evolution@swift.org>> wrote:
>> 
>>> Hi Riley,
>>> 
 On Apr 25, 2017, at 6:11 PM, Riley Testut via swift-evolution 
 mailto:swift-evolution@swift.org>> wrote:
 
 I’m sure this has already been discussed, but why are the methods throwing 
 NSErrors and not Enums? If I’m remembering correctly, the original reason 
 for this was because this was meant to be a part of Foundation. Now that 
 this is in the Standard Library, however, it seems strange that we’re 
 still using NSError.
 
 Second question that again I’m sure was asked and answered already, but: 
 why do we require implementations for each concrete numeric type (Int, 
 Int8, Int16, Float, etc), instead of using protocols (such as the new 
 Integer protocols)?
>>> 
>>> To answer your second question, the reason is that using the protocol 
>>> implies that all encoders and decoders must support anything that conforms 
>>> to that protocol.
>> 
>> Would this make it easier to transform nested JSON into a nested 
>> object/struct? If so it could be useful, very useful.
>> 
>>> We’re not sure this 

Re: [swift-evolution] [Accepted] SE-0166: Swift Archival & Serialization

2017-04-26 Thread Itai Ferber via swift-evolution

Hi Goffredo,

Unless I'm misunderstanding what you mean here, this is exactly what 
we're proposing with the API — anything `Encodable` can encode any 
type that is `Encodable` as a nested value:


```swift
struct Person : Codable {
let name: String
let address: Address
}

struct Address : Codable {
let street: String
let city: String
let state: String
let zipCode: Int
let country: String
}

let address = Address(street: "1 Infinite Loop", city: "Cupertino", 
state: "CA", zipCode: 95014, country: "United States")

let person = Person(name: "John Doe", address: address)

let encoder = JSONEncoder()
let payload = try encoder.encode(person)
print(String(data: payload, encoding: .utf8)!) // => {"name": "John 
Doe", address: {"street": "1 Infinite Loop", ... } }


let decoder = JSONDecoder()
let decoded = try decoder.decode(Person.self, from: payload) // => 
Person(name: "John Doe", address: ...)

```

Or have I misunderstood you?

— Itai

On 26 Apr 2017, at 13:11, Goffredo Marocchi via swift-evolution wrote:


Sent from my iPhone

On 26 Apr 2017, at 17:24, Tony Parker via swift-evolution 
 wrote:


Hi Riley,

On Apr 25, 2017, at 6:11 PM, Riley Testut via swift-evolution 
 wrote:


I’m sure this has already been discussed, but why are the methods 
throwing NSErrors and not Enums? If I’m remembering correctly, the 
original reason for this was because this was meant to be a part of 
Foundation. Now that this is in the Standard Library, however, it 
seems strange that we’re still using NSError.


Second question that again I’m sure was asked and answered 
already, but: why do we require implementations for each concrete 
numeric type (Int, Int8, Int16, Float, etc), instead of using 
protocols (such as the new Integer protocols)?


To answer your second question, the reason is that using the protocol 
implies that all encoders and decoders must support anything that 
conforms to that protocol.


Would this make it easier to transform nested JSON into a nested 
object/struct? If so it could be useful, very useful.


We’re not sure this is a reasonable requirement. Many formats do 
not have any kind of support for arbitrary size integers, for 
example. Therefore, we felt it was best to limit it to a set of 
concrete types.
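
To make the trade-off concrete, here is a rough sketch (the protocol names below are illustrative, not the proposal's actual API surface):

```swift
// A generic requirement would force every encoder to accept any
// conforming integer type, including ones its format cannot represent:
protocol GenericStyleContainer {
    associatedtype Key: CodingKey
    func encode<I: FixedWidthInteger>(_ value: I, forKey key: Key) throws
}

// The accepted design instead enumerates concrete overloads, so a
// format only promises support for types it can actually handle:
protocol ConcreteStyleContainer {
    associatedtype Key: CodingKey
    func encode(_ value: Int, forKey key: Key) throws
    func encode(_ value: Int8, forKey key: Key) throws
    func encode(_ value: Int16, forKey key: Key) throws
    // ... and so on for the remaining fixed-size integer and
    // floating-point types.
}
```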




I think we would be missing a trick, unless I am missing something 
here, that was very powerful in libraries like Mantle for iOS: the 
ability to translate a nested JSON object (some keys in the JSON 
object having a JSON object as value, etc.) into an MTLModel subclass 
composed of other MTLModel subclasses, where transforming the root 
object invokes the right model transformation for each of the child 
JSON objects.
Working with Mantle is safe, rugged (it does not cause crashes if the 
JSON file changes), and allows you to break the problem into chunks 
and present a coherent simple view to the code that makes use of the 
instance you created out of the JSON input. Reference: 
https://github.com/Mantle/Mantle/blob/master/README.md



We could change our minds on this before we ship Swift 4, if we feel 
it was the wrong decision. Now that the proposals are accepted we 
will be landing these branches in master soon, which means everyone 
has a great chance to try it out and see how it feels in real world 
usage before it’s final.


- Tony



On Apr 25, 2017, at 3:59 PM, Douglas Gregor via swift-evolution 
 wrote:


Proposal Link: 
https://github.com/apple/swift-evolution/blob/master/proposals/0166-swift-archival-serialization.md


Hello Swift Community,

The review of SE-0166 “Swift Archival & Serialization” ran from 
April 6...12, 2017. The proposal is accepted with some minor 
modifications. Specifically, the core protocols and types will be 
sunk down into the Swift standard library for more tight 
integration with the Swift language and compiler, and the 
operations specifically involving Foundation’s “Data” type 
will be removed. The proposal document has been updated with more 
detail. Thank you everyone for participating in this review!


- Doug
Review Manager

___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution




Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-04-05 Thread Itai Ferber via swift-evolution

> On Apr 5, 2017, at 2:29 PM, Brent Royal-Gordon via swift-evolution 
>  wrote:
> 
>> On Apr 5, 2017, at 1:44 PM, David Hart via swift-evolution 
> >> <swift-evolution@swift.org> wrote:
>> 
 For the same reasons, I continue to believe that decode functions should 
 overload on the return type. If we follow the arguments in favor of 
 providing a type argument, then why don't we also have type arguments for 
 encoders: encode(_ value: T?, forKey key: Key, as type: T.self)? I'm not 
 advocating that: I'm just pushing the argument to its logical conclusion 
 to explain why I don't understand it.
>>> 
>>> I don’t see a way for a call to encode to become ambiguous by omitting the 
>>> type argument, whereas the same is not true for a return value from decode. 
>>> The two seem fundamentally different.
>> 
>> When decoding to a property, there will be no ambiguity. And for other 
>> cases, Swift developers are already quite used to handling that kind of 
>> ambiguity, like for literals:
>> 
>> let x: UInt = 10
>> let y = 20 as CGFloat
> 
> But in the literal case, they *don't* have to deal with ambiguity for two 
> reasons:
> 
> 1. The literal provides some hint of the type; integer, float, string, array, 
> and dictionary literals are all easy to distinguish from one another.
> 
> 2. Each literal syntax has a default type. That is not true and cannot *be* 
> true for `decode()`.
> 
> Plus there's a third reason:
> 
> 3. `Decoder` doesn't guarantee there's a safety net if you use the wrong 
> type. If you, say, decode an `Int32` using `Int64`, a decoder for some 
> low-level binary type would be perfectly within its rights to read part of 
> the next field, access everything subsequent to that point in a misaligned 
> way, and go totally off the rails (as long as it doesn't violate memory 
> safety).
> 
> That third reason is exactly the same as why `unsafeBitCast(_:to:)`, 
> `bindMemory(to:capacity:)`, etc. all have a type-pinning parameter. Although 
> Swift places no restrictions on return-type inference, in practice the core 
> team thinks unconstrained return types are dangerous and should be used with 
> care, only permitted when an API explicitly exists to ease conversions 
> between different types. (`numericCast(_:)` is one example; I'm not sure if 
> there are any others.) That's just their opinion, and of course you are 
> always free to disagree with them, but I think it's a solid and easily 
> justified one.
Well put; this is basically what I was going to say.

> Besides, if you really want this, it's easy to add with a pair of extensions:
> 
>   extension KeyedDecodingContainer {
>   func decode<T: Decodable>(forKey key: Key) throws -> T {
>   return try decode(T.self, forKey: key)
>   }
>   }
>   extension UnkeyedDecodingContainer {
>   func decode<T: Decodable>() throws -> T {
>   return try decode(T.self)
>   }
>   }
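
For illustration, with extensions like those in place the call site can let the property's type pin `T` (a hypothetical usage sketch, not part of the proposal):

```swift
struct Person: Decodable {
    let name: String
    let age: Int

    private enum CodingKeys: String, CodingKey {
        case name, age
    }

    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        // The property types pin T, so no explicit `String.self` /
        // `Int.self` arguments are needed:
        name = try c.decode(forKey: .name)
        age = try c.decode(forKey: .age)
    }
}
```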
> 
> (P.S. There might be a way to square this circle: If a CodingKey knew its 
> type, the mere act of providing a CodingKey would be enough to pin the type. 
> This would not only avoid both an explicit type-pinning parameter *and* 
> unconstrained generic return types, it would also prevent you from 
> accidentally specifying the wrong type during decoding. Rough example:
> 
>   protocol CodingKey {
>   associatedtype Value: Encodable
>   
>   var stringValue: String { get }
>   var intValue: Int? { get }
>   
>   init(stringValue: String, intValue: Int?, as _: Value.Type)
>   }
>   extension CodingKey {
>   var intValue: Int { return nil }
>   }
>   
>   extension Person: Decodable {
>   struct CodingKeys: CodingKey {
>   let stringValue: String
>   
>   init(stringValue: String, intValue: Int? = nil, as _: 
> Value.Type) {
>   self.stringValue = stringValue
>   }
>   
>   static let name = CodingKeys(stringValue: "name", as: 
> String.self)
>   static let age = CodingKeys(stringValue: "age", as: 
> Int.self)
>   static let pets = CodingKeys(stringValue: "pets", as: 
> [Pet].self)
>   }
>   
>   init(from decoder: Decoder) throws {
>   let c = try decoder.container(keyedBy: CodingKeys.self)
>   
>   name = c.decode(.name)
>   age = c.decode(.age)
>   pets = c.decode(.pets)
>   }
>   }
> 
> But this doesn't work for two reasons: You can't pass an entire generic type 
> to `container(keyedBy:)` and you can't put constants in a generic type. You 
> also lose the ability to construct a CodingKey from a String or Int, you lose 

Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-04-05 Thread Itai Ferber via swift-evolution

> On Apr 4, 2017, at 6:18 PM, Brent Royal-Gordon  wrote:
> 
>> On Apr 4, 2017, at 2:43 PM, Itai Ferber wrote:
>> I like the separation between keyed and unkeyed containers (and I think 
>> "unkeyed" is a good name, though not perfect), but I'm not quite happy with 
>> the unkeyed container API. Encoding a value into an unkeyed container 
>> appends it to the container's end; decoding a value from an unkeyed 
>> container removes it from the container's front. These are very important 
>> semantics that the method names in question do not imply at all.
>> 
>> I think that consistency of phrasing is really important here, and the 
>> action words "encode" and "decode" are even more important to connote than 
>> the semantics of pushing and popping.
>> (Note that there need not be specific directionality to an unkeyed container 
>> as long as the ordering of encoded items is eventually maintained on 
>> decode.) But on a practical note, names like encodeAtEnd and decodeFromFront 
>> (or similar) don't feel like they communicate anything much more useful than 
>> the current encode/decode.
>> 
> Yeah; I stopped short of suggesting specific names because I wasn't totally 
> happy with the ones I could think of. (`appendEncoded` and `decodeNext` were 
> my best ideas, but they don't make a matched pair.)
Your suggestion in the other thread (which I’m getting to) of encodeNext and 
decodeNext sounds better, but we’ll really need to weigh whether the word "next" 
adds something significant over not having it at all.
>> Certain aspects of `UnkeyedDecodingContainer` also feel like they do the 
>> same things as `Sequence` and `IteratorProtocol`, but in different and 
>> incompatible ways. And I certainly think that the `encode(contentsOf:)` 
>> methods on `UnkeyedEncodingContainer` could use equivalents on the 
>> `UnkeyedDecodingContainer`. Still, the design in this area is much improved 
>> compared to the previous iteration.
>> 
>> Which aspects of Sequence and IteratorProtocol do you feel like you're 
>> missing on UnkeyedDecodingContainer? Keep in mind that methods on 
>> UnkeyedDecodingContainer must be able to throw, and an 
>> UnkeyedDecodingContainer can hold heterogeneous items whose type is not 
>> known, two things that Sequence and IteratorProtocol do not do.
>> 
> Yeah, that's true. One possibility is to use a closure-scoped block which 
> gets passed a sequence that terminates early if it encounters an error:
> 
>   self.pets = try decoder.withDecoded(Pet.self) { seq in
>   return Array(seq)
>   // If there is an error, `seq` terminates early, and once we
>   // return control from this closure, `withDecoded` will throw it.
>   }
> 
> 
> But that's awkward in a couple of different ways.
Agreed
>> In terms of an equivalent to encode(contentsOf:), keep in mind that this 
>> would only work if the collection you're decoding is homogeneous, in which 
>> case, you would likely prefer to decode an Array over getting an unkeyed 
>> container, no? (As soon as conditional conformance arrives in Swift, we will 
>> be able to express extension Array : Decodable where Element : Decodable { 
>> ... } making decoding homogeneous arrays trivial.)
>> 
> That's true (and I assumed that `Array`s and friends would be `Codable`—we 
> don't even need to wait for conditional conformances ), but it's hardly 
> unheard of to write your own `Collection` types when you need different 
> semantics.
> 
> Swift has two mutation protocols that are important for this purpose: 
> `RangeReplaceableCollection` and `SetAlgebra`. You could provide methods on 
> `UnkeyedDecodingContainer` that work with them:
> 
>   func decodeAll<C: RangeReplaceableCollection>(_ type: C.Type) throws -> C
>   where C.Iterator.Element: Decodable {
>   var collection = C()
>   if let capacity = self.count {
>   collection.reserveCapacity(capacity)
>   }
>   while !self.isAtEnd {
>   collection.append(try self.decode(C.Iterator.Element.self))
>   }
>   return collection
>   }
> 
>   func decodeAll<S: SetAlgebra>(_ type: S.Type) throws -> S
>   where S.Element: Decodable {
>   var set = S()
>   while !self.isAtEnd {
>   set.insert(try self.decode(S.Element.self))
>   }
>   return set
>   }
> 
>   // Usage:
>   let array = try container.decodeAll(ContiguousArray.self)
>   let set = try container.decodeAll(Set.self)
Potentially, though it would be ambiguous if you had a collection that 
conformed to both.

> Alternatively, you could take a more informal approach, and simply take a 
> function which takes a sequence and returns a value constructed from it. Most 
> collection types have an initializer like that. (Interestingly, this is 
> basically the same as `withDecoded` above, just with a different signature.)
> 
>   // I'm showing this as public, but you could hide it behind a

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-04-04 Thread Itai Ferber via swift-evolution

Hi Zach,

Thanks for your comments!
The type is called "unkeyed", but I assume "nonkeyed" was a typo and 
that's what you meant. As far as the phrasing of "ordered" and 
"sequential", both sound good, but:


1. The symmetry between "keyed" and "unkeyed" is helpful in creating 
opposition between types of encoding (and especially so in comparison to 
"single value", which is the odd man out — and you'd extremely rarely 
need to interact with it)
2. Given something that's "x" or "not x", you'd generally gravitate 
toward the thing with the more positive phrasing. As you mention, we 
really want to encourage keyed containers and diminish the use of 
unkeyed containers unless truly necessary, because they're fragile. The 
problem is, it's much easier to use the unkeyed containers — 
especially accidentally as a novice, since they're much simpler API — 
and I think "ordered" and "sequential" don't go far enough to detract 
from their usage.


They sound good, in fact too good, and we find that the more negative 
phrasing is helpful here.


— Itai

On 3 Apr 2017, at 16:01, Zach Waldowski via swift-evolution wrote:


Itai and co:



This is a solid improvement.



I think it's appropriate to diminish the importance of non-keyed
containers. "Nonkeyed" as the name is pretty iffy to me, though, even
though I admit it makes the use case pretty clear. "Ordered" or
"Sequential" both sound fine, even for an encoder that's slot-based
instead of an NSArchiver-like model. An array is ordered, but you don't
have to traverse it in order.


Best,

  Zachary Waldowski

  z...@waldowski.me





On Mon, Apr 3, 2017, at 04:31 PM, Itai Ferber via swift-evolution 
wrote:

Hi everyone,





With feedback from swift-evolution and additional internal review,
we've pushed updates to this proposal, and to the Swift Encoders[1]
proposal. In the interest of not blowing up mail clients with the full
HTML again, I'll simply be linking to the swift-evolution PR here[2],
as well as the specific diff[3] of what's changed.
At a high level:



 * The Codable protocol has been split up into Encodable and Decodable
 * String keys on CodingKey are no longer optional
 * KeyedEncodingContainer has become KeyedEncodingContainerProtocol, with a
   concrete type-erased KeyedEncodingContainer struct to hold it
 * Array responsibilities have been removed from KeyedEncodingContainer, and
   have been added to a new UnkeyedEncodingContainer type
 * codingKeyContext has been renamed codingPath

There are some specific changes inline — I know it might be a bit of a
pain, but let's keep discussion here on the mailing list instead of on
GitHub. We'll be looking to start the official review process very
soon, so we're interested in any additional feedback.
Thanks!
Thanks!





— Itai










Links:

  1. https://github.com/apple/swift-evolution/pull/640
  2. https://github.com/apple/swift-evolution/pull/639
  3. 
https://github.com/apple/swift-evolution/pull/639/commits/d619eef9166f8b45ffac152d06376cbdab536241






Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-04-04 Thread Itai Ferber via swift-evolution

Hi Brent,

Thanks for your comments and thorough review! :)
Responses inline.

On 4 Apr 2017, at 1:57, Brent Royal-Gordon wrote:

On Apr 3, 2017, at 1:31 PM, Itai Ferber via swift-evolution 
 wrote:

Hi everyone,

With feedback from swift-evolution and additional internal review, 
we've pushed updates to this proposal, and to the Swift Archival & 
Serialization proposal.
Changes to here mostly mirror the ones made to Swift Archival & 
Serialization, but you can see a specific diff of what's changed 
here. Full content below.


We'll be looking to start the official review process very soon, so 
we're interested in any additional feedback.


Thanks!

— Itai


This is a good revision to a good proposal.

I'm glad `CodingKey`s now require `stringValue`s; I think the intended 
semantics are now a lot clearer, and key behavior will be much more 
reliable.

Agreed

I like the separation between keyed and unkeyed containers (and I 
think "unkeyed" is a good name, though not perfect), but I'm not quite 
happy with the unkeyed container API. Encoding a value into an unkeyed 
container appends it to the container's end; decoding a value from an 
unkeyed container removes it from the container's front. These are 
very important semantics that the method names in question do not 
imply at all.
I think that consistency of phrasing is really important here, and the 
action words "encode" and "decode" are even more important to connote 
than the semantics of pushing and popping.
(Note that there need not be specific directionality to an unkeyed 
container as long as the ordering of encoded items is eventually 
maintained on decode.) But on a practical note, names like `encodeAtEnd` 
and `decodeFromFront` (or similar) don't feel like they communicate 
anything much more useful than the current `encode`/`decode`.


Certain aspects of `UnkeyedDecodingContainer` also feel like they do 
the same things as `Sequence` and `IteratorProtocol`, but in different 
and incompatible ways. And I certainly think that the 
`encode(contentsOf:)` methods on `UnkeyedEncodingContainer` could use 
equivalents on the `UnkeyedDecodingContainer`. Still, the design in 
this area is much improved compared to the previous iteration.
Which aspects of `Sequence` and `IteratorProtocol` do you feel like 
you're missing on `UnkeyedDecodingContainer`? Keep in mind that methods 
on `UnkeyedDecodingContainer` must be able to throw, and an 
`UnkeyedDecodingContainer` can hold heterogeneous items whose type is 
not known, two things that `Sequence` and `IteratorProtocol` do not do.


In terms of an equivalent to `encode(contentsOf:)`, keep in mind that 
this would only work if the collection you're decoding is homogeneous, 
in which case, you would likely prefer to decode an `Array` over getting 
an unkeyed container, no? (As soon as conditional conformance arrives in 
Swift, we will be able to express `extension Array : Decodable where 
Element : Decodable { ... }` making decoding homogeneous arrays 
trivial.)
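
To illustrate, once that conditional conformance exists, decoding a homogeneous array collapses to a single call (the `Pet`/`Person` types here are assumed for the example):

```swift
struct Pet: Codable {
    let name: String
}

struct Person: Decodable {
    let pets: [Pet]

    private enum CodingKeys: String, CodingKey {
        case pets
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // Array's conditional Decodable conformance performs the
        // element-by-element loop for us:
        pets = try container.decode([Pet].self, forKey: .pets)
    }
}
```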


(Tiny nitpick: I keep finding myself saying "encode into", not "encode 
to" as the API name suggests. Would that be a better parameter label?)
On a personal note here — I agree with you, and had originally used 
"into". However, we've reviewed our APIs and more often have balanced 
`from:/to:` rather than `from:/into:` on read/write/streaming calls. 
We'd like to rein these in a bit and keep them consistent within our 
naming guidelines, as much as possible.


I like the functionality of the `userInfo` dictionary, but I'm still 
not totally satisfied casting out of `Any` all the time. I might just 
have to get over that, though.
I think this is the closest we can get to a pragmatic balance between 
dynamic needs and static guarantees. :)


I wonder if `CodingKey` implementations might ever need access to the 
`userInfo`. I suppose you can just switch to a different set of 
`CodingKeys` if you do.
I don't think `CodingKey` should ever know about `userInfo` — 
`CodingKey`s should be inert data. If you need to, use the `userInfo` to 
switch to a different set of keys, as you mention.


Should there be a way for an `init(from:)` implementation to determine 
the type of container in the encoder it's just been handed? Or perhaps 
the better question is, do we want to promise users that all decoders 
can tell the difference?
I think it would be very rare to need this type of information. If a 
type wants to encode as an array or as a dictionary conditionally, the 
context for that would likely be present in `userInfo`.
If you really must try to decode regardless, you can always try to grab 
one container type from the decoder, and if it fails, attempt to grab 
the other container type.
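
A sketch of that fallback approach, assuming a payload that may arrive as either a dictionary or a bare array (the type and key names are illustrative):

```swift
struct Ambiguous: Decodable {
    var values: [String]

    private enum CodingKeys: String, CodingKey {
        case values
    }

    init(from decoder: Decoder) throws {
        if let keyed = try? decoder.container(keyedBy: CodingKeys.self) {
            // Payload was a dictionary: {"values": [...]}
            values = try keyed.decode([String].self, forKey: .values)
        } else {
            // Payload was a bare array: [...]
            var unkeyed = try decoder.unkeyedContainer()
            var result: [String] = []
            while !unkeyed.isAtEnd {
                result.append(try unkeyed.decode(String.self))
            }
            values = result
        }
    }
}
```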



* * *

I went ahead and implemented a basic version of `Encoder` and 
`Encodable` in a Swift 3 playground, just to get a f

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-04-03 Thread Itai Ferber via swift-evolution
Hi everyone,

With feedback from swift-evolution and additional internal review, we've pushed 
updates to this proposal, and to the Swift Encoders proposal.
In the interest of not blowing up mail clients with the full HTML again, I'll 
simply be linking to the swift-evolution PR here, as well as the specific diff 
of what's changed.

At a high level:

- The Codable protocol has been split up into Encodable and Decodable
- String keys on CodingKey are no longer optional
- KeyedEncodingContainer has become KeyedEncodingContainerProtocol, with a
  concrete type-erased KeyedEncodingContainer struct to hold it
- Array responsibilities have been removed from KeyedEncodingContainer, and
  have been added to a new UnkeyedEncodingContainer type
- codingKeyContext has been renamed codingPath
There are some specific changes inline — I know it might be a bit of a pain, 
but let's keep discussion here on the mailing list instead of on GitHub.
We'll be looking to start the official review process very soon, so we're 
interested in any additional feedback.

Thanks!

— Itai


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-23 Thread Itai Ferber via swift-evolution

Hi Oliver,

Thanks for your comments! We thought about this and we agree overall — 
we will incorporate this suggestion along with others in the next batch 
update as long as nothing prohibitive comes up.


— Itai

On 23 Mar 2017, at 7:49, Oliver Jones wrote:


Like everyone I’m excited by this new proposal.  But…

protocol Codable: Adopted by types to opt into archival. Conformance 
may be automatically derived in cases where all properties are also 
Codable.


… can I make one suggestion.  Please do not repeat the mistakes of 
NSCoding in combining the encoding and decoding into a single 
protocol.  Just as there are Encoder and Decoder classes their should 
be Encodable and Decodable protocols (maybe have an aggregate Codable 
protocol for convenience but do not force it).


My reasoning:

Sometimes you only want to decode or encode and object and not vice 
versa.  This is often the case with Web APIs and JSON serialisation.


Eg:

Often an app only consumes (decodes) JSON encoded objects and never 
writes them out (a read only app for example). So the encode(to:) 
methods are completely redundant and someone adopting Codable should 
not be forced to write them.


If only I had a dollar for all the times I’ve seen this sort of code 
in projects:


class MyClass : NSCoding {
init?(coder: NSCoder) {
  // ... some decoding code
}

func encode(with aCoder: NSCoder) {
   preconditionFailure(“Not implemented”)
}
}


Another example:

Web APIs often take data in a different structure as input (i.e. 
“Request” objects) than they output.  These request objects are 
only ever encoded and never decoded by an application so implementing 
init(from:) is completely redundant.


Personally I think the approach taken by libraries like Wrap 
(https://github.com/johnsundell/wrap) and Unbox 
(https://github.com/JohnSundell/Unbox) is a much better design.  
Encoding and decoding should not be the same protocol.


Yes I understand that Codable could provide no-op (or 
preconditionFailure) protocol extension based default implementations 
of init(from:) and encode(to:) (or try to magic up implementations 
based on the Codable nature of public properties as suggested in the 
proposal) but to me that seems like a hack that is papering over bad 
design.  I think this joint Codable design probably fails the Liskov 
substitution principle too.


So I again implore you to consider splitting Codable into two 
protocols, one for encoding and another for decoding.


Sorry if I’m repeating what other people have already said.  I’ve 
not read every response to this proposal on the list.


Regards





Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-22 Thread Itai Ferber via swift-evolution

Hi Ben,

What’s the use case that you were thinking of? `KeyPath`s could be 
useful in the case where you don’t need to customize your key names, 
but cannot represent a custom case like


```swift
public struct Post {
var authorID: Int
var bodyText: String

private enum CodingKeys : String, CodingKey {
case authorID = "author_id"
case bodyText = "body_text"
}
}
```

Or am I misunderstanding?

— Itai

On 22 Mar 2017, at 5:39, Ben Rimmington wrote:


On 15 Mar 2017, at 22:40, Itai Ferber wrote:

The following introduces a new Swift-focused archival and 
serialization API as part of the Foundation framework. We’re 
interested in improving the experience and safety of performing 
archival and serialization, and are happy to receive community 
feedback on this work.


Instead of a CodingKeys enum, could the KeyPath proposal be utilized 
somehow?


-- Ben


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-21 Thread Itai Ferber via swift-evolution

Hi Colin,

Thanks for your comments! Are you talking about `Codable` synthesis, or 
encoding in general?


On 21 Mar 2017, at 8:44, Colin Barrett wrote:


Hi Itai,

Glad to see these proposal! I'm curious, have you or the other Swift 
folks
thought about how *users* of these new Codable protocols will interact 
with

resilience domains?

What I mean is that what appear to be private or internal identifiers, 
and
thus changeable at will, may actually be fragile in that changing them 
will

break the ability to decode archives encoded by previous versions.

Making this safer could mean:
- Encoding only public properties
Unfortunately, property accessibility in code does not always map 1-to-1 
with accessibility for archival (nor do I think they should be tied to 
one another).
There are certainly cases where you’d want to include private 
information in an archive, but that is not useful to expose to external 
clients, e.g., a struct/class version:


```swift
public struct MyFoo {
// Should be encoded.
public var title: String
public var identifier: Int

// This should be encoded too — in case the struct changes in the
// future, want to be able to refer to the payload version.
private let version = 1.0
}
```

Of course, there can also be public properties that you don’t find 
useful to encode. At the moment, I’m not sure there’s a much better 
answer than "the author of the code will have to think about the 
representation of their data"; even if there were an easier way to 
annotate "I definitely want this to be archived"/"I definitely don’t 
want this to be archived", the annotation would still need to be manual.


(The above applies primarily in the case of `Codable` synthesis; when 
implementing `Codable` manually I don’t think the compiler should ever 
prevent you from doing what you need.)



- Adding some form of indirection (a la ObjC non-fragile ivars?)

What do you mean by this?


- Compiler warning (or disallowing) changes to properties in certain
situations.
We’ve thought about this with regards to identifying classes uniquely 
across renaming, moving modules, etc.; this is a resilience problem in 
general.
In order for the compiler to know about changes to your code it’d need 
to keep state across compilations. While possible, this feels pretty 
fragile (and potentially not very portable).


* Compiler warns about changing a property? Blow away the cache 
directory!
* Cloning the code to a new machine for the first time? Hmm, all the 
warnings went away…


This would be nice to have, but yes:


I imagine the specifics would need to follow the rest of the plans for
resilience.
specifics on this would likely be in line with the rest of resilience 
plans for Swift in general.


It's likely that this could be addressed by a future proposal, as for 
the

time being developers can simply "not hold it wrong" ;)

Thanks,
-Colin

On Wed, Mar 15, 2017 at 6:52 PM Itai Ferber via swift-evolution <
swift-evolution@swift.org> wrote:


Hi everyone,

The following introduces a new Swift-focused archival and 
serialization
API as part of the Foundation framework. We’re interested in 
improving the
experience and safety of performing archival and serialization, and 
are

happy to receive community feedback on this work.
Because of the length of this proposal, the *Appendix* and 
*Alternatives
Considered* sections have been omitted here, but are available in the 
full

proposal <https://github.com/apple/swift-evolution/pull/639> on the
swift-evolution repo. The full proposal also includes an *Unabridged 
API* for

further consideration.

Without further ado, inlined below.

— Itai

Swift Archival & Serialization

   - Proposal: SE- 
<https://github.com/apple/swift-evolution/pull/639>
   - Author(s): Itai Ferber <https://github.com/itaiferber>, Michael 
LeHew
   <https://github.com/mlehew>, Tony Parker 
<https://github.com/parkera>

   - Review Manager: TBD
   - Status: *Awaiting review*
   - Associated PRs:
  - #8124 <https://github.com/apple/swift/pull/8124>
  - #8125 <https://github.com/apple/swift/pull/8125>

Introduction

Foundation's current archival and serialization APIs (NSCoding, 
NSJSONSerialization, NSPropertyListSerialization, etc.), while fitting 
for the dynamism of Objective-C, do not always map optimally into 
Swift. This document lays out the design of an updated API that 
improves the developer experience of performing archival and 
serialization in Swift.

Specifically:

   - It aims to provide a solution for the archival of Swift struct and 
     enum types
   - It aims to provide a more type-safe solution for serializing to 
     external formats, such as JSON and plist

Motivation

The primary motivation for this proposal is the inclusion of native 
Swift enum and struct types in archival and serialization. Currently, 
developers targeting Swift c

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-20 Thread Itai Ferber via swift-evolution
I don’t think there’s much of a difference between adding an 
"optional" primitive (which has a default implementation in terms of a 
different primitive) and simply having that type adopt `Codable` itself 
and not be a primitive. You can still switch on the type dynamically 
(and we do), but you don’t need the optional overload for it.


On 19 Mar 2017, at 19:33, Jonathan Hull wrote:

> On Mar 17, 2017, at 1:23 PM, Brent Royal-Gordon via swift-evolution 
 wrote:


(Also, is there any sense in adding `Date` to this set, since it 
needs special treatment in many of our formats?)


We’ve considered adding Date to this list. However, this means 
that any format that is a part of this system needs to be able to 
make a decision about how to format dates. Many binary formats have 
no native representations of dates, so this is not necessarily a 
guarantee that all formats can make.


Looking for additional opinions on this one.

I think that, if you're taking the view that you want to provide a 
set of pre-specified primitive methods as a list of things you want 
encoders to make a policy decision about, Date is a good candidate. 
But as I said earlier, I'd prefer to radically reduce the set of 
primitives, not add to it.


IIUC, two of your three proposed, Foundation-provided coders need to 
do something special with dates; perhaps one of the three needs to do 
something special with different integer sizes and types. Think of 
that as a message about your problem domain.


Have you considered having a very small set of true primitives, and a 
larger set of optional primitives.  For the optional primitives, a 
default implementation would be provided that converts it to/from one 
of the true primitives (e.g. date <—> string), but it would still 
provide an override point for formats that want to support it more 
directly.


Thanks,
Jon
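Jonathan's optional-primitive idea can be sketched with a protocol extension. This is a minimal sketch with hypothetical names, using an ISO 8601 lowering as one possible default:

```swift
import Foundation

// Hypothetical sketch: encode(_: Date) is an "optional primitive" whose
// default implementation lowers the value to a true primitive (String).
protocol PrimitiveEncoder {
    func encode(_ value: String)
    func encode(_ value: Date) // optional primitive
}

extension PrimitiveEncoder {
    // Default: lower Date to an ISO 8601 string primitive.
    func encode(_ value: Date) {
        encode(ISO8601DateFormatter().string(from: value))
    }
}

// A flat text format keeps the default and only ever sees strings.
final class TextEncoder: PrimitiveEncoder {
    var output: [String] = []
    func encode(_ value: String) { output.append(value) }
}

// A format with a native date representation overrides the default.
final class NativeDateEncoder: PrimitiveEncoder {
    var dates: [Date] = []
    func encode(_ value: String) {}
    func encode(_ value: Date) { dates.append(value) }
}
```

Note that the override point only dispatches dynamically because `encode(_: Date)` is declared as a protocol requirement, not merely in the extension.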



___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-17 Thread Itai Ferber via swift-evolution



On 17 Mar 2017, at 13:36, Joe Groff wrote:


> On Mar 17, 2017, at 12:49 PM, Itai Ferber  wrote:


On 17 Mar 2017, at 12:18, Michael Gottesman wrote:


On Mar 16, 2017, at 10:23 AM, Joe Groff via swift-evolution 
 wrote:


On Mar 16, 2017, at 10:21 AM, Itai Ferber  wrote:

On 15 Mar 2017, at 19:12, Joe Groff wrote:


On Mar 15, 2017, at 6:46 PM, Itai Ferber  wrote:

Thanks Joe, and thanks for passing this along!

To those who are curious, we use abstract base classes for a 
cascading list of reasons:


• We need to be able to represent keyed encoding and decoding 
containers as abstract types which are generic on a key type
• There are two ways to support abstraction in this way: protocol & 
type constraints, and generic types
• Since Swift protocols are not generic, we unfortunately cannot 
write `protocol KeyedEncodingContainer<Key : CodingKey> { ... }`, which 
is the "ideal" version of what we're trying to represent

• Let's try this with a protocol first (simplified here):

protocol Container {
    associatedtype Key : CodingKey
}

func container<Key : CodingKey, Cont : Container>(_ type: Key.Type) 
-> Cont where Cont.Key == Key {
    // return something
}

This looks promising so far — let's try to make it concrete:

struct ConcreteContainer<K : CodingKey> : Container {
    typealias Key = K
}

func container<Key : CodingKey, Cont : Container>(_ type: Key.Type) 
-> Cont where Cont.Key == Key {
    return ConcreteContainer<Key>() // error: Cannot convert return 
expression of type 'ConcreteContainer<Key>' to return type 'Cont'
}

Joe or anyone from the Swift team can describe this better, but this 
is my poor-man's explanation of why this happens. Swift's type 
constraints are "directional" in a sense. You can constrain a type 
going into a function, but not out of a function. There is no type I 
could return from inside of container() which would satisfy this 
constraint, because the constraint can only be satisfied by turning 
Cont into a concrete type from the outside.


Okay, well let's try this:

func container... {
    return ConcreteContainer<Key>() as! Cont
}

This compiles fine! Hmm, let's try to use it:

container(Int.self) // error: Generic parameter 'Cont' could not be 
inferred


The type constraint can only be fulfilled from the outside, not the 
inside. The function call itself has no context for the concrete type 
that this would return, so this is a no-go.


• If we can't do it with type constraints in this way, is it 
possible with generic types? Yep! Generic types satisfy this without 
a problem. However, since we don't have generic protocols, we have to 
use a generic abstract base class to represent the same concept — 
an abstract container generic on the type of key which dynamically 
dispatches to the "real" subclassed type


Hope that gives some simplified insight into the nature of this 
decision.
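Condensed into code, the generic-abstract-class workaround described above might look like this (a sketch with hypothetical names, not the proposal's actual API):

```swift
// Abstract base: generic on the key type, dispatching dynamically to a
// subclass that provides the real implementation.
class KeyedContainer<Key: CodingKey> {
    func encode(_ value: Int, forKey key: Key) { fatalError("abstract") }
}

// The "real" container hides behind the abstract type.
final class ConcreteKeyedContainer<Key: CodingKey>: KeyedContainer<Key> {
    var storage: [String: Int] = [:]
    override func encode(_ value: Int, forKey key: Key) {
        storage[key.stringValue] = value
    }
}

// Unlike the protocol attempt, this return type is expressible: the caller
// fixes Key, and the concrete subclass never appears in the signature.
func container<Key: CodingKey>(keyedBy type: Key.Type) -> KeyedContainer<Key> {
    return ConcreteKeyedContainer<Key>()
}
```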


I see. Protocols with associated types serve the same purpose as 
generic interfaces in other languages, but we don't have the 
first-class support for protocol types with associated type 
constraints (a value of type `Container where Key == K`). That's 
something we'd like to eventually support. In other places in the 
standard library, we write the type-erased container by hand, which 
is why we have `AnySequence`, `AnyCollection`, and `AnyHashable`. You 
could probably do something similar here; that would be a bit awkward 
for implementers, but might be easier to migrate forward to where we 
eventually want to be with the language.


-Joe

Yep, that’s a good way to describe it.
We could potentially do that as well, but adding another type like 
AnyHashable or AnyCollection felt like a much more sweeping change, 
considering that those require some special compiler magic themselves 
(and we’d like to do as little of that as we can).


AnyCollection doesn't have any special compiler magic. AnyHashable's 
only magic is that it has implicit conversions, but that would become 
normal behavior once it can be replaced by a plain Hashable 
existential type.


Hey Itai. I am not sure if I missed this. But did you follow up with 
why you didn't want to use AnyCollection/AnyHashable? The thread got 
really long pretty fast.


I responded to this in a different part of the thread very recently. 
Can you elaborate on how a type like AnyCollection/AnyHashable would 
help here? More important than the type erasure is the type being 
generic on the key type, and this must be specified. How would this 
be possible





You can implement an AnyContainer type that conforms to the 
Container protocol, using an abstract base class that erases the type, 
and a private subclass that forwards the interface to a contained 
value. 
https://www.bignerdranch.com/blog/breaking-down-type-erasure-in-swift/ 
 
runs the technique down fairly well. I don't see any obvious reason we 
couldn't do that here. The tradeoff I see is whether the inconvenience 
of manually wrapping conforming types in AnyContainer outweighs
the constraint that container implementations must be su
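The manual type-erasure technique Joe refers to can be sketched roughly as follows (hypothetical names; the base/box structure follows the linked article):

```swift
// The protocol with an associated key type that we want to erase.
protocol Container {
    associatedtype Key: CodingKey
    func encode(_ value: Int, forKey key: Key)
}

// Abstract base generic only on Key, erasing the concrete conformer.
private class _AnyContainerBase<Key: CodingKey>: Container {
    func encode(_ value: Int, forKey key: Key) { fatalError("abstract") }
}

// Private box forwarding the interface to a contained value.
private final class _ContainerBox<C: Container>: _AnyContainerBase<C.Key> {
    private let base: C
    init(_ base: C) { self.base = base }
    override func encode(_ value: Int, forKey key: C.Key) {
        base.encode(value, forKey: key)
    }
}

// Public wrapper: generic on Key alone, hiding the concrete type.
struct AnyContainer<Key: CodingKey>: Container {
    private let box: _AnyContainerBase<Key>
    init<C: Container>(_ base: C) where C.Key == Key {
        self.box = _ContainerBox(base)
    }
    func encode(_ value: Int, forKey key: Key) {
        box.encode(value, forKey: key)
    }
}
```

The trade-off Joe mentions is visible here: conforming types must be wrapped in `AnyContainer` by hand at the API boundary.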

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-17 Thread Itai Ferber via swift-evolution



On 17 Mar 2017, at 12:18, Michael Gottesman wrote:

On Mar 16, 2017, at 10:23 AM, Joe Groff via swift-evolution 
 wrote:




[snip]

AnyCollection doesn't have any special compiler magic. AnyHashable's 
only magic is that it has implicit conversions, but that would become 
normal behavior once it can be replaced by a plain Hashable 
existential type.


Hey Itai. I am not sure if I missed this. But did you follow up with 
why you didn't want to use AnyCollection/AnyHashable? The thread got 
really long pretty fast.
I responded to this in a different part of the thread very recently. Can 
you elaborate on how a type like `AnyCollection`/`AnyHashable` would 
help here? More important than the type erasure is the type being 
generic on the key type, and this must be specified. How would this be 
possible?



Michael



-Joe


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-17 Thread Itai Ferber via swift-evolution

Do you mean versions of the format, or versions of your type?

If the latter, this can be done on a case-by-case basis, as needed. You 
can always do something like


```swift
struct Foo : Codable {
// Name this as appropriate
private let jsonVersion = 1.1
}
```

and have it encode as well.
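On the decoding side, such a version field can then be validated up front. A sketch (the container and decode calls follow the proposed API shape; the version number and policy are just examples):

```swift
import Foundation

struct Foo: Codable {
    private var jsonVersion = 1.1

    init() {}

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        let version = try container.decode(Double.self, forKey: .jsonVersion)
        // Reject payloads written by a newer, unknown format revision.
        guard version <= 1.1 else {
            throw DecodingError.dataCorruptedError(
                forKey: .jsonVersion, in: container,
                debugDescription: "Unsupported version \(version)")
        }
        self.jsonVersion = version
    }
}
```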

On 17 Mar 2017, at 11:51, T.J. Usiyan wrote:


Is there any sense of encoding versions (as in, changes to the JSON
representation, for instance)? I don't know that it is necessarily a 
good idea overall, but now is the time to consider it.

On Fri, Mar 17, 2017 at 2:27 PM, Matthew Johnson via swift-evolution <
swift-evolution@swift.org> wrote:



On Mar 17, 2017, at 1:15 PM, Itai Ferber via swift-evolution <
swift-evolution@swift.org> wrote:

[snip]

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-17 Thread Itai Ferber via swift-evolution

On 16 Mar 2017, at 14:29, Matthew Johnson wrote:

This is a fantastic proposal!  I am very much looking forward to 
robust Swift-native encoding and decoding in Foundation.  The 
compiler-synthesized conformances are especially great! I want to thank 
everyone who worked on it.  It is clear that a lot of work went into 
the proposal.


The proposal covers a lot of ground, so I’m breaking my comments up 
by topic in the order they occur in the proposal.

Thanks for the feedback, Matthew! Responses inline.


Encode / Decode only types:

Brent raised the question of decode only types.  Encode only types are 
also not uncommon when an API accepts an argument payload that gets 
serialized into the body of a request. The compiler synthesis feature 
in the proposal makes providing both encoding and decoding easy in 
common cases but this won’t always work as needed.


The obvious alternative is to have Decodable and Encodable protocols 
which Codable refines.  This would allow us to omit a conformance we 
don’t need when it can’t be synthesized.
If conformances are still synthesized individually (i.e. for just 
`Decodable` or just `Encodable`), it would be way too easy to 
accidentally conform to one or the other and not realize that you’re 
not conforming to `Codable`, since the synthesis is invisible. You’d 
just be missing half of the protocol.


If the way out of this is to only synthesize conformance to `Codable`, 
then it’s much harder to justify the inclusion of `Encodable` or 
`Decodable` since those would require a manual implementation and would 
much more rarely be used.


Your reply to Brent mentions using `fatalError` to avoid implementing 
the direction that isn't needed.  I think it would be better if the 
conformance can reflect what is actually supported by the type.  
Requiring us to write `fatalError` as a stub for functionality we 
don’t need is a design problem IMO.  I don’t think the extra 
protocols are really that big a burden.  They don’t add any new 
functionality and are very easy to understand, especially considering 
the symmetry they would have with the other types you are introducing.


Coding Keys:

As others have mentioned, the design of this protocol does not require 
a value of a conforming type to actually be a valid key (it can return 
nil for both `intValue` and `stringValue`).  This seems problematic to 
me.


In the reply to Brent again you mention throwing and 
`preconditionFailure` as a way to handle incompatible keys.  This also 
seems problematic to me and feels like a design problem. If we really 
need to support more than one underlying key type and some encoders 
will reject some key types this information should be captured in the 
type system.  An encoder would only vend a keyed container for keys it 
actually supports.  Ideally the conformance of a type’s CodingKeys 
could be leveraged to produce a compiler error if an attempt was made 
to encode this type into an encoder that can’t support its keys.  In 
general, the idea is to produce static errors as close to the origin 
of the programming mistake as possible.


I would very much prefer that we don’t defer to runtime assertions 
or thrown errors, etc for conditions that could be caught statically 
at compile time given the right design.  Other comments have indicated 
that static guarantees are important to the design (encoders *must* 
guarantee support of primitives specified by the protocols, etc).  Why 
is a static guarantee of compatible coding keys considered less 
important?
I agree that it would be nice to support this in a static way, but while 
not impossible to represent in the type system, it absolutely explodes 
the API into a ton of different types and protocols which are not 
dissimilar. We’ve considered this in the past (see approach #4 in the 
[Alternatives 
Considered](https://github.com/itaiferber/swift-evolution/blob/637532e2abcbdb9861e424359bb6dac99dc6b638/proposals/-swift-archival-serialization.md#alternatives-considered) 
section) and moved away from it for a reason.


To summarize:

* To statically represent the difference between an encoder which 
supports string keys and one which supports integer keys, we would have 
to introduce two different protocol types (say, `StringKeyEncoder` and 
`IntKeyEncoder`)
* Now that there are two different encoder types, the `Codable` protocol 
also needs to be split up into two — one version which encodes using a 
`StringKeyEncoder` and one version which encodes using an 
`IntKeyEncoder`. If you want to support encoding to an encoder which 
supports both types of keys, we’d need a _third_ Codable protocol 
which takes something that’s `StringKeyEncoder & IntKeyEncoder` 
(because you cannot just conform to both `StringCodable` and 
`IntCodable` — it’s ambiguous when given something that’s 
`StringKeyEncoder & IntKeyEncoder`)
* On encoders which support both string and integer keys, you need 
overloads for `encode(…)`, `encodeIntCodable>(…)`, and `e

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-17 Thread Itai Ferber via swift-evolution



On 15 Mar 2017, at 22:58, Zach Waldowski wrote:

Another issue of scale - I had to switch to a native mail client as 
replying inline severely broke my webmail client. ;-)


Again, lots of love here. Responses inline.

On Mar 15, 2017, at 6:40 PM, Itai Ferber via swift-evolution 
 wrote:

Proposed solution
We will be introducing the following new types:

protocol Codable: Adopted by types to opt into archival. Conformance 
may be automatically derived in cases where all properties are also 
Codable.
FWIW I think this is acceptable compromise. If the happy path is 
derived conformances, only-decodable or only-encodable types feel like 
a lazy way out on the part of a user of the API, and builds a barrier 
to proper testing.

[snip]

Structured types (i.e. types which encode as a collection of 
properties) encode and decode their properties in a keyed manner. 
Keys may be String-convertible or Int-convertible (or both), and user 
types which have properties should declare semantic key enums which 
map keys to their properties. Keys must conform to the CodingKey 
protocol:

public protocol CodingKey { <##snip##> }


A few things here:

The protocol leaves open the possibility of having both a String or 
Int representation, or neither. What should a coder do in either case? 
Are the representations intended to be mutually exclusive, or not? The 
protocol design doesn’t seem particularly matching with the flavor 
of Swift; I’d expect something along the lines of a CodingKey enum 
and the protocol CodingKeyRepresentable. It’s also possible that the 
concerns of the two are orthogonal enough that they deserve separate 
container(keyedBy:) requirements.
The general answer to "what should a coder do" is "what is appropriate 
for its format". For a format that uses exclusively string keys (like 
JSON), the string representation (if present on a key) will always be 
used. If the key has no string representation but does have an integer 
representation, the encoder may choose to stringify the integer. If the 
key has neither, it is appropriate for the `Encoder` to fail in some 
way.


On the flip side, for totally flat formats, an `Encoder` may choose to 
ignore keys altogether, in which case it doesn’t really matter. The 
choice is up to the `Encoder` and its format.


The string and integer representations are not meant to be mutually 
exclusive at all, and in fact, where relevant, we encourage providing 
both types of representations for flexibility.
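For illustration, a key type supplying both representations might look like the following sketch (the key names are examples):

```swift
// A CodingKey conformer offering both string and integer representations.
enum PersonKeys: CodingKey {
    case name
    case age

    var stringValue: String {
        switch self {
        case .name: return "name"
        case .age: return "age"
        }
    }

    var intValue: Int? {
        switch self {
        case .name: return 0
        case .age: return 1
        }
    }

    init?(stringValue: String) {
        switch stringValue {
        case "name": self = .name
        case "age": self = .age
        default: return nil
        }
    }

    init?(intValue: Int) {
        switch intValue {
        case 0: self = .name
        case 1: self = .age
        default: return nil
        }
    }
}
```

A string-keyed format would use `stringValue`, a compact binary format could prefer `intValue`, and neither representation is ever missing.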


As for the possibility of having neither representation, this question 
comes up often. I’d like to summarize the thought process here by 
quoting some earlier review (apologies for the poor formatting from my 
mail client):


If there are two options, each of which is itself optional, we have 4 
possible combinations. But! At the same time we prohibit one 
combination by what? Runtime error? Why not use a 3-case enum for it? 
Even further down the rabbit hole there might be a CodingKey<> 
specialized for a concrete combination, like 
CodingKey or just CodingKey, but I’m 
not sure whether our type system will make it useful or possible…


public enum CodingKeyValue {
  case integer(value: Int)
  case string(value: String)
  case both(intValue: Int, stringValue: String)
}
public protocol CodingKey {
  init?(value: CodingKeyValue)
  var value: CodingKeyValue { get }
}
I agree that this certainly feels suboptimal. We explored other 
possibilities before settling on this one, so let me try to summarize 
here:


* Having a concrete 3-case CodingKey enum would preclude the 
possibility of having neither a stringValue nor an intValue. However, 
there is a lot of value in having the key types belong to the type 
being encoded (more safety, impossible to accidentally mix key types, 
private keys, etc.); if the CodingKey type itself is an enum (which 
cannot be inherited from), then this prevents differing key types.
* Your solution as presented is better: CodingKey itself is still a 
protocol, and the value itself is the 3-case enum. However, since 
CodingKeyValue is not literal-representable, user keys cannot be enums 
RawRepresentable by CodingKeyValue. That means that the values must 
either be dynamically returned, or (for attaining the benefits that we 
want to give users — easy representation, autocompletion, etc.) the 
type has to be a struct with static lets on it giving the 
CodingKeyValues. This certainly works, but is likely not what a 
developer would have in mind when working with the API; the power of 
enums in Swift makes them very easy to reach for, and I’m thinking 
most users would expect their keys to be enums. We’d like to 
leverage that where we can, especially since RawRepresentable enums 
are appropriate in the vast majority of use cases.
* Three separate CodingKey protocols (one for Strings, one for Ints, 
and one for both). You could argue that this is the most correct 
version, since it most clearly r

Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution



On 16 Mar 2017, at 15:45, Matthew Johnson wrote:


> On Mar 16, 2017, at 5:37 PM, Itai Ferber  wrote:


On 16 Mar 2017, at 14:48, Matthew Johnson wrote:

Thank you again for bringing these great proposals forward!

Thanks for reviewing it, and for your comments!

I only have a couple of questions about this proposal.

I noticed that the types in this proposal don’t conform to Encoder 
and Decoder. Is the plan to have them to provide private conforming 
types to Codable types they are asked to encode or decode?


Yes. This is because the top-level interface for encoding and 
decoding in JSON and plist is different from the intermediate 
interface that Encoder and Decoder offer. As such, the top-level 
types don’t conform to Encoder and Decoder, but vend out internal 
types which do.


This makes sense.  I was initially concerned about the meaning of 
mutating these values during encoding or decoding but it looks like 
that isn’t possible without some really nefarious code that passes a 
reference to the top-level encoder / decoder to an object that is 
getting encoded / decoded.  What will you do if somebody actually does 
this?
The options are copied immutably into the internal types and the 
originals are not consulted during the process of encoding or decoding 
— we want to prevent exactly this.
FWIW, you can see the current implementation of this on [the 
implementation PR](https://github.com/apple/swift/pull/8124).


Why are the strategy and format properties readwrite instead of 
configured at initialization time? Is the intent that the encoder / 
decoder can be re-used with a different configuration in a subsequent 
call to encode or decode?


Yes. It’s also a mouthful to have them all as params in the 
constructor, especially if we add more options in the future.


Taking them in an initializer would not need to be wordy - they could 
all specify default arguments.

Sure, but if you want to specify a lot of them…
But, this is more of a stylistic argument. Six of one, half-dozen of 
another. The more useful thing is supporting mutation after 
initialization, which is a reasonable thing to want to do.


Finally, I agree with Brent’s comments regarding errors. I would 
prefer to see Foundation move away from NSError in favor of 
domain-specific error types. That said, the comment that this is a 
broader discussion for Foundation and not something to change in this 
proposal is reasonable. I hope Foundation will consider changing this 
in the future.


Thanks for your understanding — we will keep these concerns in 
mind.


Matthew






Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution
The sorted order of keys is an implementation detail of 
`PropertyListSerialization` which `JSONSerialization` does not share.
This would require a change in `JSONSerialization`, which is possible, 
but out of scope for this work specifically. It would have to go through 
review.


[FWIW, the key ordering isn’t changing at random; `JSONSerialization` 
writes keys in the order that it receives them (sorted in the order that 
`NSDictionary` stores them, based on hash). If you want this to not 
interfere with your Git repo (on a more immediate timescale), you can 
probably do something like adding a pre-commit Git hook to lint the file 
as needed. But I digress, this is totally off-topic.]


On 16 Mar 2017, at 14:57, Ben Rimmington wrote:

Sorry, I actually meant that keys would be ordered **alphabetically** 
(as they are with PropertyListSerialization).


I'm storing output from JSONSerialization in a git repository, and 
when I add/remove a single key-value the entire file changes, due to 
keys being reordered (seemingly at random).


-- Ben


Itai Ferber wrote:

By stable ordering, you mean dictionaries with keys being output in 
the order that they were encoded?
Potentially, but this would require additional work as neither Swift 
dictionaries nor NSMutableDictionary support this (reasonable seeing 
as dictionaries are by definition not sorted), and the current 
implementation uses JSONSerialization as the concrete serializer. 
Certainly not impossible, though.



On 16 Mar 2017, at 8:49, Ben Rimmington wrote:



Could the `JSONEncoder.OutputFormatting.prettyPrinted` case include 
the **stable ordering** of keys?


-- Ben
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution

On 16 Mar 2017, at 14:48, Matthew Johnson wrote:


Thank you again for bringing these great proposals forward!

Thanks for reviewing it, and for your comments!


I only have a couple of questions about this proposal.

I noticed that the types in this proposal don’t conform to Encoder 
and Decoder.  Is the plan to have them provide private conforming 
types to Codable types they are asked to encode or decode?
Yes. This is because the top-level interface for encoding and decoding 
in JSON and plist is different from the intermediate interface that 
`Encoder` and `Decoder` offer. As such, the top-level types don’t 
conform to `Encoder` and `Decoder`, but vend out internal types which 
do.


Why are the strategy and format properties readwrite instead of 
configured at initialization time?  Is the intent that the encoder / 
decoder can be re-used with a different configuration in a subsequent 
call to encode or decode?
Yes. It’s also a mouthful to have them all as params in the 
constructor, especially if we add more options in the future.
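As a sketch of that reuse pattern (the date strategies shown are from the companion JSONEncoder proposal):

```swift
import Foundation

let encoder = JSONEncoder()

encoder.dateEncodingStrategy = .secondsSince1970
let unixData = try encoder.encode([Date(timeIntervalSince1970: 0)])

// The same instance is reconfigured and reused for a subsequent call.
encoder.dateEncodingStrategy = .iso8601
let isoData = try encoder.encode([Date(timeIntervalSince1970: 0)])

print(String(data: unixData, encoding: .utf8)!)
print(String(data: isoData, encoding: .utf8)!) // contains "1970-01-01T00:00:00Z"
```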


Finally, I agree with Brent’s comments regarding errors.  I would 
prefer to see Foundation move away from NSError in favor of 
domain-specific error types.  That said, the comment that this is a 
broader discussion for Foundation and not something to change in this 
proposal is reasonable.  I hope Foundation will consider changing this 
in the future.

Thanks for your understanding — we will keep these concerns in mind.


Matthew
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution



On 16 Mar 2017, at 13:27, David Hart wrote:

> On 16 Mar 2017, at 20:55, Itai Ferber via swift-evolution 
 wrote:


I’m going to reply to this thread as a whole — apologies if 
there’s someone’s comment that I’ve missed.


This is something that has come up in internal review, and we’ve 
certainly given it thought. As Zach has already mentioned, the 
primary concern with overloading based on return type is ambiguity.
There are many cases in which Swift’s type system currently does 
not handle ambiguity in the way that you would expect, and it can be 
very surprising. For instance,


```swift
func foo() -> Int { return 42 }
func foo() -> Double { return .pi }
func consumesInt(_ x : Int) { print(x) }

let x = foo() // Ambiguous use of foo()
consumesInt(x) // Even though x is going to be used as an Int
let y: Int = x // Same here
```

`let x = foo() as Int` works now, but it actually didn’t always — 
until a somewhat recent version of Swift AFAICT, the only way to 
resolve the ambiguity was through `let x: Int = foo()`. This has since 
been fixed, but it was very confusing to try to figure out the 
unambiguous way to call it.


Keep in mind that this isn’t an unreasonable thing to want to do:

```swift
struct Foo {
    var x: Int
    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)

        // Want to process an element before it’s assigned.
        let x = container.decode(forKey: .x) // Ambiguous call

        // Or whatever.
        if x < 0 {
            self.x = x + 100
        } else {
            self.x = x * 200
        }
    }
}
```

You can write `let x: Int = container.decode(…)` or `let x = 
container.decode(…) as Int`, but this isn’t always intuitive.



That’s where I disagree. Let me try to prove my point:

You bring up the example of having to store the decoded value in a 
variable before setting it to a typed property. But it’s also not 
unreasonable to want to do the same thing when encoding the value, 
possibly storing it into a different type. If we follow that argument, 
it’s also not very intuitive to have to do


container.encode(x as Double, forKey: .x).

Wouldn’t that be an argument to have an API like this:

`func encode<T>(_ value: T?, forKey key: Key, as type: T.Type) throws`

I don’t agree that these are equivalent cases.

Here, for an `as` cast to be valid, the type of `x` must be an 
existential (I’m guessing `Any`).
The original `container.encode(x, forKey: .x)` call is not ambiguous 
because `x` has no type, but rather because the type of `x` does not 
match any of the overloads. You would get the same error as if you wrote


```swift
struct NonCodableFoo {}
let x = NonCodableFoo()
container.encode(x, forKey: .x)
```

You have to convert the type to something that fits one of the 
overloads.


On encode, there cannot be any true ambiguity because it’s not 
possible to satisfy more than one of these concrete overloads. You 
cannot have a thing with a type which would satisfy both, say, `Int` and 
`Double`.


I would argue that type inference is a core feature in Swift and that 
we should embrace it. I believe that in most cases the return value of 
encode will be stored into a typed property and type inference will do 
the right thing. In the few cases where the type has to be enforced, 
the patterns you mention above are not weird syntax; they are used and 
useful all over Swift:

Sure, but I think these cases are not equivalent.


let cgFloat: CGFloat = 42
`42` defaults to `Int`, but since `CGFloat` is 
`ExpressibleByIntegerLiteral`, this becomes the equivalent of writing `let 
cgFloat = CGFloat(42)`, which would not be ambiguous without the 
`CGFloat` annotation; you would just get an `Int`.


With `let x = container.decode(forKey: .x)`, `x` _has no type_ unless 
otherwise specified.



let pi = 3.14159265359 as Float
Same here, but with `Double` instead of `Int`, and `Float` instead of 
`CGFloat`…


let person = factory.get<Person>() // potential feature in Generics 
Manifesto
This isn’t type inference. This is type specification, which is 
exactly what we are trying to do. At the moment, explicit type 
specification has a different syntax: passing a metatype as an argument.


If this feature were available, this is what we would use.

The way I think about it is that the type argument is already there as 
a generic parameter. Adding an extra argument that needs to be 
explicitly given on every single call feels like unneeded verbosity to 
me.
For consideration: why does `let person = factory.get<Person>()` seem 
reasonable, but `let person = factory.get(Person.self)` does not?


Consider also that the metatype would be necessary for 
`decode<Value: Decodable>(_ type: Value.Type, forKey: Key) -> Value` 
because the return value of that certainly could be ambiguous in many 
cases.


Finally, the metatype arg allows you to express the following 
succinctly: let v: SuperClass = container.decode(SubClass.self, 
forKey: .v).


In the general case (`decode`) we would need the metatype to avoid 
ambiguity.

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution
If throwing subscripts made it in the Swift 4 timeframe, then we would 
certainly consider it.


On 16 Mar 2017, at 13:19, Matthew Johnson wrote:


> On Mar 16, 2017, at 3:01 PM, Itai Ferber  wrote:


Subscripts, by the way, would not help here, since they cannot throw. 
decode must be able to throw.
[SR-238](https://bugs.swift.org/browse/SR-238); 
for Apple folks, 28775436.


They don’t “help” but they do provide a more natural interface.  
If the Foundation team feels a more wordy interface is necessary that 
is ok.


I specifically mentioned that they can’t throw yet.  Throwing 
subscripts would make a good companion proposal if they could fit into 
the Swift 4 timeframe.  If not, then yes we need a method rather than 
a subscript.  But if we can get throwing subscripts into Swift 4, why 
not use Swift’s first class syntactic support for keyed access to 
keyed containers?



On 16 Mar 2017, at 11:46, Matthew Johnson via swift-evolution wrote:



On Mar 16, 2017, at 1:34 PM, Zach Waldowski via swift-evolution 
wrote:


On Thu, Mar 16, 2017, at 02:23 PM, Matthew Johnson via 
swift-evolution wrote:
I don’t have an example but I don’t see a problem either.  
There are two options for specifying the return type manually.  We 
can use the signature you used above and use `as` to specify the 
expected type:


let i = decode(.myKey) as Int


The awkwardness of this syntax is exactly what I'm referring to. 
Would a beginner know to use "as Int" or ": Int"? Why would they? 
The "prettiness" of the simple case doesn't make up for how 
difficult it is to understand and fix its failure cases.


Any official Swift or Foundation API shouldn't, or shouldn't need 
to, make use of "tricky" syntax.


I don’t think this is especially tricky.  Nevertheless, we can 
avoid requiring this syntax by moving the type argument to the end 
and providing a default.  But I think return type inference is worth 
supporting.  It has become widely adopted by the community already in 
this use case.




If we don’t support this in Foundation we will continue to see 
3rd party libraries that do this.


The proposal's been out for less than 24 hours, is it really 
productive to already be taking our ball and go home over such a 
minor thing?


I don’t think that’s what I’m doing at all.  This is a 
fantastic proposal.  I’m still working through it and writing up my 
more detailed thoughts.


That said, as with many (most?) first drafts, there is room for 
improvement.  I think it’s worth pointing out the syntax that many 
of us would like to use for decoding and at least considering 
including it in the proposal.  If the answer is that it’s trivial 
for those who want to use subscripts to write the wrappers for return 
type inference and / or subscripts themselves that’s ok.  But 
it’s a fair topic for discussion and should at least be addressed 
as an alternative that was rejected for a specific reason.




Zach Waldowski
z...@waldowski.me 




___
swift-evolution mailing list
swift-evolution@swift.org 
https://lists.swift.org/mailman/listinfo/swift-evolution


___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution 




___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution
By stable ordering, you mean dictionaries with keys being output in the 
order that they were encoded?
Potentially, but this would require additional work as neither Swift 
dictionaries nor `NSMutableDictionary` support this (reasonable seeing 
as dictionaries are by definition not sorted), and the current 
implementation uses `JSONSerialization` as the concrete serializer. 
Certainly not impossible, though.


On 16 Mar 2017, at 8:49, Ben Rimmington wrote:




Could the `JSONEncoder.OutputFormatting.prettyPrinted` case include 
the **stable ordering** of keys?


-- Ben
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution

Hi Slava,

Thanks for your comments!

On 16 Mar 2017, at 13:50, Slava Pestov wrote:


Hi Itai,

I’m wondering what the motivation is for keeping this as part of 
Foundation and not the standard library. It seems like you’re 
landing an implementation of this in the Foundation overlay on master, 
and another copy of all the code will have to go into 
swift-corelibs-foundation. This seems suboptimal. Or are there future 
plans to unify the Foundation overlay with corelibs-foundation 
somehow?
This has to be part of Foundation because `Data`, a Foundation type, is 
one of the primitive types of serialization. This will be doubly true if 
we decide to add `Date` as another primitive type.


I agree that this is suboptimal at the moment, but we will work to find 
a way to keep the work in sync in a reasonable manner.


Also the implementation uses some Foundation-isms (NSMutableArray, 
NSNumber) and it would be nice to stick with idiomatic Swift as much 
as possible instead.

Using the Foundation framework is just as idiomatic in Swift… ;)
In this specific case, we need collections with reference semantics 
(`NSMutableArray` and `NSMutableDictionary`) and a concrete type-erased 
number box (`NSNumber`); there’s no reason to reinvent the wheel if 
we already have exactly the tools we need.


The reference implementation at the moment goes through 
`JSONSerialization`, which affects the specifics of its implementation. 
This may change in the future.


Finally you should take a look at the integer protocol work 
(https://github.com/apple/swift-evolution/blob/master/proposals/0104-improved-integers.md) 
to replace the repetitive code surrounding primitive types, however I 
don’t know if this has landed in master yet.
As mentioned in other emails, the list of primitive types was carefully 
chosen because we need to have a concrete list of types which consumers 
can rely on being supported, and that `Encoder`s and `Decoder`s know 
they _must_ support.


Specifically:

1. For binary formats, the difference between an `Int16` and an `Int64` 
is significant. The `Encoder` would need to know that it’s received 
one type or another, not just a `FixedWidthInteger`; this would involve 
a runtime check of the concrete type of the argument
2. Any type can conform to these protocols — nothing is preventing me 
from writing an `Int37` type conforming to `FixedWidthInteger` and 
passing it in. Most encoders would really not know what to do with this 
type (especially ones with binary formats), but the failure would be a 
runtime one instead of a static one
  * A concrete example of this is the `FloatingPoint` protocol. 
`Float80` conforms to the protocol, but no common binary format I’ve 
seen supports 80-bit floating-point values. We’d prefer to prevent 
that statically by accepting only `Float` and `Double`
3. Consumers of the API then have no guarantees that a specific 
`Encoder` supports the type that they need. Did the encoder remember to 
support `UInt64` values? Similarly, `Encoder`s and `Decoder`s don’t 
know what types they need to be considering. Am I supposed to handle 
`UInt8` differently from `Int16`? With a list of concrete types, this 
becomes immediately clear — both consumers and writers of `Encoder`s 
have a clear contract.
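A toy illustration of the static guarantee (a hypothetical `MiniKeyedContainer`, not the proposed API): with a fixed overload set, an unsupported type like `Float80` is rejected at compile time.

```swift
// Hypothetical container with a concrete overload set, for illustration only.
struct MiniKeyedContainer {
    private(set) var storage: [String: String] = [:]

    mutating func encode(_ value: Int64, forKey key: String)  { storage[key] = "i:\(value)" }
    mutating func encode(_ value: Double, forKey key: String) { storage[key] = "d:\(value)" }
    mutating func encode(_ value: String, forKey key: String) { storage[key] = "s:\(value)" }
    // No Float80 overload: `container.encode(Float80(1), forKey: "x")`
    // would fail to compile, rather than failing at runtime inside a
    // binary format writer.
}

var container = MiniKeyedContainer()
container.encode(42 as Int64, forKey: "id")
container.encode("hello", forKey: "greeting")
print(container.storage["id"]!)       // i:42
print(container.storage["greeting"]!) // s:hello
```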




Slava



___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution



On 16 Mar 2017, at 1:00, Brent Royal-Gordon wrote:

On Mar 15, 2017, at 3:43 PM, Itai Ferber via swift-evolution 
 wrote:


Hi everyone,
This is a companion proposal to the Foundation Swift Archival & 
Serialization API. This introduces new encoders and decoders to be 
used as part of this system.

The proposal is available online and inlined below.


Executive summary: I like where you're going with this, but I'm 
worried about flexibility.


I'm not going to quote every bit of the JSON section because Apple 
Mail seems to destroy the formatting when I reply, but: I think you've 
identified several of the most important customization points (Date, 
Data, and illegal Floats). However, I think:


* People may want to map illegal Floats to legal floating-point values 
(say, `greatestFiniteMagnitude`, `-greatestFiniteMagnitude`, and `0`) 
or map them to `null`s. They may also want different behavior for 
different things: imagine `(positiveInfinity: 
Double.greatestFiniteMagnitude, negativeInfinity: 
-Double.greatestFiniteMagnitude, nan: .throw)`.

I agree, this may be something that users could want.



* Large integers are another big concern that you don't address. 
Because JSON only supports doubles, APIs that use 64-bit IDs often 
need them to be passed as strings, frequently with a different key 
("id_str" instead of "id").
This is not true — JSON has no limitations on what numbers it can 
represent. 340282366920938463463374607431768211455 (2^128-1) is a 
perfectly legitimate number in JSON, though you may have a hard time 
reading it in on some platforms. _JavaScript_ numbers are IEEE 754 
doubles, but that’s a JavaScript problem, not a JSON problem.


If what you mean here is that some large numbers should be encoded as 
strings instead of integers for the benefit of the other side reading it 
in a valid way, then perhaps.


* For that matter, style and capitalization are a problem. JSON style 
varies, but it *tends* to be snake_case, where Cocoa favors camelCase. 
You can address this at the CodingKey level by manually specifying 
string equivalents of all the coding keys, but that's kind of a pain, 
and it affects all of your code and all of your serializations.


I'm sorely tempted to suggest that we give the JSON encoder and 
decoder a delegate:


```swift
public protocol JSONCodingDelegate {
    /// Returns the string name to be used when encoding or decoding the
    /// given CodingKey as JSON.
    ///
    /// - Returns: The string to use, or `nil` for the default.
    func jsonName(for key: CodingKey, at keyPath: [CodingKey], in encoderOrDecoder: AnyObject) throws -> String?

    // These are used when encoding/decoding any of the integer types.
    func jsonValue(from integer: Int64, at keyPath: [CodingKey], in encoder: JSONEncoder) throws -> JSONValue?
    func integer(from jsonValue: JSONValue, at keyPath: [CodingKey], in decoder: JSONDecoder) throws -> Int64?

    // These are used when encoding/decoding any of the floating-point types.
    func jsonValue(from number: Double, at keyPath: [CodingKey], in encoder: JSONEncoder) throws -> JSONValue?
    func number(from jsonValue: JSONValue, at keyPath: [CodingKey], in decoder: JSONDecoder) throws -> Double?

    // These are used when encoding/decoding Date.
    func jsonValue(from date: Date, at keyPath: [CodingKey], in encoder: JSONEncoder) throws -> JSONValue?
    func date(from jsonValue: JSONValue, at keyPath: [CodingKey], in decoder: JSONDecoder) throws -> Date?

    // These are used when encoding/decoding Data.
    func jsonValue(from data: Data, at keyPath: [CodingKey], in encoder: JSONEncoder) throws -> JSONValue?
    func data(from jsonValue: JSONValue, at keyPath: [CodingKey], in decoder: JSONDecoder) throws -> Data?

    func jsonValue(from double: Double, at keyPath: [CodingKey], in encoder: JSONEncoder) throws -> JSONValue?
    func integer(from jsonValue: JSONValue, at keyPath: [CodingKey], in decoder: JSONDecoder) throws -> Double?
}

public enum JSONValue {
    case string(String)
    case number(Double)
    case bool(Bool)
    case object([String: JSONValue])
    case array([JSONValue])
    case null
}
```
I disagree with generalizing this to the point of being on a delegate. 
This is all work that you could be doing in `encode(to:)` and 
`decode(from:)`. In `encode(to:)`, it’s always possible to clamp an 
invalid floating-point number to `Double.greatestFiniteMagnitude`, and 
always possible to `encode("\(id)", forKey: .id)` instead of `encode(id, 
forKey: .id)`.
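Both workarounds can be sketched in `encode(to:)` (the `Account` type and `id_str` key are illustrative, not from the proposal):

```swift
import Foundation

struct Account: Encodable {
    let id: Int64
    let score: Double

    private enum CodingKeys: String, CodingKey {
        case id = "id_str" // 64-bit ID written as a string under a different key
        case score
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode("\(id)", forKey: .id)
        // Clamp non-finite values instead of letting the encoder throw.
        let clamped = score.isFinite
            ? score
            : (score > 0 ? Double.greatestFiniteMagnitude : -Double.greatestFiniteMagnitude)
        try container.encode(clamped, forKey: .score)
    }
}

let data = try JSONEncoder().encode(Account(id: 1234567890123456789, score: .infinity))
let json = String(data: data, encoding: .utf8)!
print(json)
```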


The options that we have on `JSONEncoder` and `JSONDecoder` straddle a 
fine line between being pedantically correct (and refusing to break 
encapsulation for enco

Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution

We’ll keep this in mind. :)

On 15 Mar 2017, at 19:58, Will Stanton wrote:


Hello Itai,

Thanks for your response and its explanations!

Agreed that comprehension of multiple formats is important since there 
are a couple common ways of encoding JSON dates!


Still, ISO 8601 appears pretty often (though I don’t have data on 
that, Stack Overflow says RFC 7493 I-JSON prefers ISO 8601; 
https://tools.ietf.org/html/rfc7493#section-4.3), and as other servers 
might make/handle a lot of JSON produced to/from the API, I think it 
would be disadvantageous to default to `deferredToDate` (if 
`deferredToDate` doesn't use the ISO 8601 format).


As you mention, writers/readers have to agree on their format - my 2¢ 
is that ISO 8601 would be more common, and so a better default, than a 
Unix or reference date timestamp.


Regards,
Will Stanton


— Thanks for the prediction :-)




On Mar 15, 2017, at 9:53 PM, Itai Ferber  wrote:

Hi Will,

Thanks for your comments!
deferredToDate simply uses the default implementation that Date 
provides — since it is not a primitive type like Int or String and 
conforms to Codable itself, it will have an implementation of 
init(from:) and encode(to:). It will have an implementation that 
makes sense for Date in general, but since a big use-case for JSON 
lies in talking to external servers which you don't control, allowing 
for custom date formatting is important.


To that end, although ISO 8601 may make sense for some applications 
as the default, it is less efficient to format, encode, decode, and 
parse than, say, writing out a UNIX timestamp as a Double (or 
similar). Thus, the default is to allow Date to pick a representation 
that best suits itself, and if you need customization, you have the 
option to use it.


Since Date makes a concrete decision about how to encode, both sides 
will need to use deferredToDate for compatibility, in the same way 
that they would have to agree about ISO 8601, or any of the other 
options.


HTH!

— Itai

P.S. About Xcode autocompletion slowdown, I don't know to be honest, 
but I can't imagine it would be significant. Only certain types have 
enc... or dec... and even then, the list of methods isn't that long.
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-16 Thread Itai Ferber via swift-evolution
Foundation’s common currency type for errors is unequivocally 
`NSError`, and we’re not looking to change that as part of this 
proposal.
Reusing errors from the Cocoa domain has always been a reasonable choice 
for 3rd-party developers, and we’d like to keep doing this.


We are looking to improve the experience of throwing such errors (with 
some trivial extensions) by allowing you to do something like `throw 
CocoaError.error(.coderInvalidValue)`. (Of course, don’t quote me on 
the exact syntax, as this is totally subject to change.)

This is pretty simple, and nets you free user-facing localization.

On 15 Mar 2017, at 22:46, Zach Waldowski wrote:

Just a little one that came up with when thinking over the companion 
proposal… see inline.


On Mar 15, 2017, at 6:43 PM, Itai Ferber via swift-evolution 
 wrote:

Foundation-Provided Errors

Along with providing the above encoders and decoders, we would like 
to promote the use of a common set of error codes and messages across 
all new encoders and decoders. A common vocabulary of expected errors 
allows end-users to write code agnostic about the specific 
encoder/decoder implementation they are working with, whether 
first-party or third-party:


```swift
extension CocoaError.Code {
    public static var coderInvalidValue: CocoaError.Code
    public static var coderTypeMismatch: CocoaError.Code
    public static var coderReadCorrupt: CocoaError.Code
    public static var coderValueNotFound: CocoaError.Code
}
```


I find all the reasons compelling, but I don’t agree with the overall 
choice. A common vocabulary, side table info, and localization can be 
achieved with a dedicated error type for coders. I understand ABI 
(sorta) concerns are involved here, that older Foundations will 
continue to return the same NSError codes. However, it's never felt 
appropriate to create my own error instances with NSCocoaErrorDomain, 
and this feels very similar to that. What are our options here?


Zach Waldowski
z...@waldowski.me



___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution

Thanks for the comments, David.
I responded to #2 in a separate email, but wanted to get back to 
responding to #1.


In implementing this, I have had the same thoughts. Ideally, one day, we 
would be able to migrate the implementation of this away from the 
compiler to public API (through reflection, property behaviors, or 
similar). If the compiler offers external features that would allow us 
to do everything that we want, I would be more than happy to move the 
implementation from inside the compiler to outside of it.


On 16 Mar 2017, at 0:09, David Hart wrote:


First of all, great proposal :D

Brent, earlier in the thread makes a lot of good points. But I’d 
still like to discuss two subjects:


1) What makes the proposal really stand on its feet compared to 
third-party libraries is the compiler generation magic. I feel divided 
about it. On one hand, this is the only solution today to have this 
level of type and key safety. But on another hand, I have the 
impression that future versions of Swift (with more reflection, 
property behaviours, lenses, etc…) would dramatically affect how 
this subject is treated and implemented. Are you worried that we are 
asking the compiler to do work which might be unnecessary in the 
future? That this topic would be better expressed with more powerful 
language features? Any plans for this API to smoothly migrate to 
those features in the future?


2) Libraries like Marshal (https://github.com/utahiosmac/Marshal) 
and Unbox (https://github.com/JohnSundell/Unbox) don’t require the 
decoding functions to provide the type: those functions are generic on 
the return type and it’s automatically inferred:

```swift
func decode<T>(key: Key) -> T

self.stringProperty = decode(key: .stringProperty) // correct specialisation of the generic function chosen by the compiler
```

Is there a reason the proposal did not choose this solution? It’s 
quite sweet.



Swift Archival & Serialization
Proposal: SE-
Author(s): Itai Ferber, Michael LeHew, Tony Parker
Review Manager: TBD
Status: Awaiting review
Associated PRs:
#8124
#8125



___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution
Subscripts, by the way, would not help here, since they cannot throw. 
`decode` must be able to throw.
[SR-238](https://bugs.swift.org/browse/SR-238?jql=text%20~%20%22subscript%20throw%22); 
for Apple folks, 28775436.


On 16 Mar 2017, at 11:46, Matthew Johnson via swift-evolution wrote:

> On Mar 16, 2017, at 1:34 PM, Zach Waldowski via swift-evolution 
 wrote:


On Thu, Mar 16, 2017, at 02:23 PM, Matthew Johnson via 
swift-evolution wrote:
I don’t have an example but I don’t see a problem either.  There 
are two options for specifying the return type manually.  We can use 
the signature you used above and use `as` to specify the expected 
type:


let i = decode(.myKey) as Int


The awkwardness of this syntax is exactly what I'm referring to. 
Would a beginner know to use "as Int" or ": Int"? Why would they? The 
"prettiness" of the simple case doesn't make up for how difficult it 
is to understand and fix its failure cases.


Any official Swift or Foundation API shouldn't, or shouldn't need to, 
make use of "tricky" syntax.


I don’t think this is especially tricky.  Nevertheless, we can avoid 
requiring this syntax by moving the type argument to the end and 
providing a default.  But I think return type inference is worth 
supporting.  It has become widely adopted by the community already in 
this use case.




If we don’t support this in Foundation we will continue to see 3rd 
party libraries that do this.


The proposal's been out for less than 24 hours, is it really 
productive to already be taking our ball and go home over such a 
minor thing?


I don’t think that’s what I’m doing at all.  This is a fantastic 
proposal.  I’m still working through it and writing up my more 
detailed thoughts.


That said, as with many (most?) first drafts, there is room for 
improvement.  I think it’s worth pointing out the syntax that many 
of us would like to use for decoding and at least considering 
including it in the proposal.  If the answer is that it’s trivial 
for those who want to use subscripts to write the wrappers for return 
type inference and / or subscripts themselves that’s ok.  But it’s 
a fair topic for discussion and should at least be addressed as an 
alternative that was rejected for a specific reason.




Zach Waldowski
z...@waldowski.me 




___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution




___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution
I’m going to reply to this thread as a whole — apologies if 
there’s someone’s comment that I’ve missed.


This is something that has come up in internal review, and we’ve 
certainly given it thought. As Zach has already mentioned, the primary 
concern with overloading based on return type is ambiguity.
There are many cases in which Swift’s type system currently does not 
handle ambiguity in the way that you would expect, and it can be very 
surprising. For instance,


```swift
func foo() -> Int { return 42 }
func foo() -> Double { return .pi }
func consumesInt(_ x : Int) { print(x) }

let x = foo() // Ambiguous use of foo()
consumesInt(x) // Even though x is going to be used as an Int
let y: Int = x // Same here
```

`let x = foo() as Int` works now, but it actually didn’t always — 
until a somewhat recent version of Swift AFAICT, the only way to resolve 
the ambiguity was through `let x: Int = foo()`. This has since been 
fixed, but it was very confusing to try to figure out the unambiguous 
way to call it.
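The same ambiguity falls out of any pair of functions overloaded only on return type — a self-contained toy example:

```swift
func fetch() -> Int { return 42 }
func fetch() -> Double { return 3.14 }

// let a = fetch()        // error: ambiguous use of 'fetch()'
let b: Int = fetch()      // resolved by the contextual type
let c = fetch() as Double // resolved by an `as` coercion
print(b) // 42
print(c) // 3.14
```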


Keep in mind that this isn’t an unreasonable thing to want to do:

```swift
struct Foo {
var x: Int
init(from decoder: Decoder) throws {
let container = try decoder.container(keyedBy: CodingKeys.self)

// Want to process an element before it’s assigned.
let x = container.decode(forKey: .x) // Ambiguous call

// Or whatever.
        if x < 0 {
            self.x = x + 100
        } else {
            self.x = x * 200
        }
}
}
```

You can write `let x: Int = container.decode(…)` or `let x = 
container.decode(…) as Int`, but this isn’t always intuitive.
Consider also that the metatype would be necessary for 
`decode<Value: Decodable>(_ type: Value.Type, forKey: Key) -> Value` 
because the return value of that certainly could be ambiguous in many 
cases.


Finally, the metatype arg allows you to express the following 
succinctly: `let v: SuperClass = container.decode(SubClass.self, forKey: 
.v)`.
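Putting it together, the Foo example from above becomes unambiguous once the metatype argument is supplied (a sketch against the proposed API):

```swift
import Foundation

struct Foo: Decodable {
    var x: Int

    private enum CodingKeys: String, CodingKey { case x }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // The metatype argument makes the call unambiguous.
        let x = try container.decode(Int.self, forKey: .x)
        self.x = x < 0 ? x + 100 : x * 200
    }
}

let foo = try JSONDecoder().decode(Foo.self, from: Data("{\"x\": -5}".utf8))
print(foo.x) // 95
```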


In the general case (`decode<Value: Decodable>`) we would need the 
metatype to avoid ambiguity. It’s not strictly necessary for primitive 
types, but helps in the case of ambiguity, and solves the conceptual 
overhead of "Why do I specify the type sometimes but not others? Why are 
some of these types special? Should I always provide the type? Why 
wouldn’t I?"


Matthew offered `func decode<T: Decodable>(_ key: Key, as type: T.Type = T.self) 
throws -> T` which looks appealing, but:


1. Doesn’t help resolve the ambiguity either
2. Allows for 3 ways of expressing the same thing (`let x: Int = 
decode(key)`, `let x = decode(key) as Int`, and `let x = decode(key, as: 
Int.self)`)


The cognitive overhead of figuring out all of the ambiguity goes away 
when we’re consistent everywhere.
FWIW, too, I am not convinced that Foundation should add API just 
because 3rd parties will add it. The ambiguity in the general case 
cannot be solved by wrappers, and I would prefer to provide one simple, 
consistent solution; if 3rd parties would like to add wrappers for their 
own sake, then I certainly encourage that.


On 16 Mar 2017, at 11:46, Matthew Johnson via swift-evolution wrote:

> On Mar 16, 2017, at 1:34 PM, Zach Waldowski via swift-evolution 
 wrote:


On Thu, Mar 16, 2017, at 02:23 PM, Matthew Johnson via 
swift-evolution wrote:
I don’t have an example but I don’t see a problem either.  There 
are two options for specifying the return type manually.  We can use 
the signature you used above and use `as` to specify the expected 
type:


let i = decode(.myKey) as Int


The awkwardness of this syntax is exactly what I'm referring to. 
Would a beginner know to use "as Int" or ": Int"? Why would they? The 
"prettiness" of the simple case doesn't make up for how difficult it 
is to understand and fix its failure cases.


Any official Swift or Foundation API shouldn't, or shouldn't need to, 
make use of "tricky" syntax.


I don’t think this is especially tricky.  Nevertheless, we can avoid 
requiring this syntax by moving the type argument to the end and 
providing a default.  But I think return type inference is worth 
supporting.  It has become widely adopted by the community already in 
this use case.




If we don’t support this in Foundation we will continue to see 3rd 
party libraries that do this.


The proposal's been out for less than 24 hours, is it really 
productive to already be taking our ball and go home over such a 
minor thing?


I don’t think that’s what I’m doing at all.  This is a fantastic 
proposal.  I’m still working through it and writing up my more 
detailed thoughts.


That said, as with many (most?) first drafts, there is room for 
improvement.  I think it’s worth pointing out the syntax that many 
of us would like to use for decoding and at least considering 
including it in the proposal.  If the answer is that it’s trivial 
for those who want to use subscripts to write the wrappers for return 
type inference and / or subscripts themselves that’s ok.  But it’s 
a fair topic for discussion.

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution

Thanks for the thorough and detailed review, Brent! Responses inline.

On 15 Mar 2017, at 21:19, Brent Royal-Gordon wrote:

On Mar 15, 2017, at 3:40 PM, Itai Ferber via swift-evolution 
 wrote:


Hi everyone,

The following introduces a new Swift-focused archival and 
serialization API as part of the Foundation framework. We’re 
interested in improving the experience and safety of performing 
archival and serialization, and are happy to receive community 
feedback on this work.


Thanks to all of the people who've worked on this. It's a great 
proposal.



Specifically:

	• It aims to provide a solution for the archival of Swift struct 
and enum types


I see a lot of discussion here of structs and classes, and an example 
of an enum without associated values, but I don't see any discussion 
of enums with associated values. Can you sketch how you see people 
encoding such types?


For example, I assume that `Optional` is going to get some special 
treatment, but if it doesn't, how would you write its `encode(to:)` 
method?
`Optional` values are accepted and vended directly through the API. The 
`encode(_:forKey:)` methods take optional values directly, and 
`decodeIfPresent(_:forKey:)` vend optional values.


`Optional` is special in this way — it’s a primitive part of the 
system. It’s actually not possible to write an `encode(to:)` method 
for `Optional`, since the representation of null values is up to the 
encoder and the format it’s working in; `JSONEncoder`, for instance, 
decides on the representation of `nil` (JSON `null`). It wouldn’t be 
possible to ask `nil` to encode itself in a reasonable way.
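A minimal sketch of that behavior, written against the shipped Swift 4 names (`JSONEncoder` and the synthesized conformance, which uses `encodeIfPresent`/`decodeIfPresent` for optional properties):

```swift
import Foundation

struct Person : Codable {
    var name: String
    var nickname: String? // handled via encodeIfPresent/decodeIfPresent
}

// A nil optional is simply omitted from the keyed container by default.
let data = try! JSONEncoder().encode(Person(name: "Ada", nickname: nil))
let decoded = try! JSONDecoder().decode(Person.self, from: data)
```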


What about a more complex enum, like the standard library's 
`UnicodeDecodingResult`:


enum UnicodeDecodingResult {
case emptyInput
case error
case scalarValue(UnicodeScalar)
}

Or, say, an `Error`-conforming type from one of my projects:

public enum SQLError: Error {
case connectionFailed(underlying: Error)
case executionFailed(underlying: Error, statement: SQLStatement)
case noRecordsFound(statement: SQLStatement)
case extraRecordsFound(statement: SQLStatement)
	case columnInvalid(underlying: Error, key: ColumnSpecifier, 
statement: SQLStatement)
	case valueInvalid(underlying: Error, key: AnySQLColumnKey, 
statement: SQLStatement)

}

(You can assume that all the types in the associated values are 
`Codable`.)
Sure — these cases specifically do not derive `Codable` conformance 
because the specific representation to choose is up to you. Two possible 
ways to write this, though there are many others (I’m simplifying 
these cases here a bit, but you can extrapolate this):


```swift
// Approach 1
// This produces either {"type": 0} for `.noValue`, or {"type": 1, 
"value": …} for `.associated`.

public enum EnumWithAssociatedValue : Codable {
case noValue
case associated(Int)

private enum CodingKeys : CodingKey {
case type
case value
}

public init(from decoder: Decoder) throws {
let container = try decoder.container(keyedBy: CodingKeys.self)
let type = try container.decode(Int.self, forKey: .type)
switch type {
case 0:
self = .noValue
case 1:
let value = try container.decode(Int.self, forKey: .value)
self = .associated(value)
default:
throw …
}
}

public func encode(to encoder: Encoder) throws {
let container = encoder.container(keyedBy: CodingKeys.self)
switch self {
case .noValue:
try container.encode(0, forKey: .type)
case .associated(let value):
try container.encode(1, forKey: .type)
try container.encode(value, forKey: .value)
}
}
}

// Approach 2
// Produces `0`, `1`, or `2` for `.noValue1`, `.noValue2`, and 
`.noValue3` respectively.
// Produces {"type": 3, "value": …} and {"type": 4, "value": …} for 
`.associated1` and `.associated2`.

public enum EnumWithAssociatedValue : Codable {
case noValue1
case noValue2
case noValue3
case associated1(Int)
case associated2(String)

private enum CodingKeys : CodingKey {
case type
case value
}

public init(from decoder: Decoder) throws {
if let container = try? decoder.singleValueContainer() {
let type = try container.decode(Int.self)
switch type {
case 0: self = .noValue1
case 1: self = .noValue2
case 2: self = .noValue3
default: throw …
}
} else {
let container = try decoder.container(keyedBy: 
CodingKeys.self)

let type = try container.decode(Int.self, forKey: .type)
switch type {
case 3:
l

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-16 Thread Itai Ferber via swift-evolution



On 15 Mar 2017, at 19:12, Joe Groff wrote:


On Mar 15, 2017, at 6:46 PM, Itai Ferber  wrote:

Thanks Joe, and thanks for passing this along!

To those who are curious, we use abstract base classes for a 
cascading list of reasons:


	• We need to be able to represent keyed encoding and decoding 
containers as abstract types which are generic on a key type
	• There are two ways to support abstraction in this way: protocol 
& type constraints, and generic types
		• Since Swift protocols are not generic, we unfortunately cannot 
write protocol KeyedEncodingContainer<Key : CodingKey> { ... }, which 
is the "ideal" version of what we're trying to represent

• Let's try this with a protocol first (simplified here):

protocol Container {
associatedtype Key : CodingKey
}

func container<Key : CodingKey, Cont : Container>(_ type: Key.Type) 
-> Cont where Cont.Key == Key {

// return something
}

This looks promising so far — let's try to make it concrete:

struct ConcreteContainer<K : CodingKey> : Container {
typealias Key = K
}

func container<Key : CodingKey, Cont : Container>(_ type: Key.Type) 
-> Cont where Cont.Key == Key {
return ConcreteContainer() // error: Cannot convert return 
expression of type 'ConcreteContainer' to return type 'Cont'

}

Joe or anyone from the Swift team can describe this better, but this 
is my poor-man's explanation of why this happens. Swift's type 
constraints are "directional" in a sense. You can constrain a type 
going into a function, but not out of a function. There is no type I 
could return from inside of container() which would satisfy this 
constraint, because the constraint can only be satisfied by turning 
Cont into a concrete type from the outside.


Okay, well let's try this:

func container... {
return ConcreteContainer() as! Cont
}

This compiles fine! Hmm, let's try to use it:

container(Int.self) // error: Generic parameter 'Cont' could not be 
inferred


The type constraint can only be fulfilled from the outside, not the 
inside. The function call itself has no context for the concrete type 
that this would return, so this is a no-go.


	• If we can't do it with type constraints in this way, is it 
possible with generic types? Yep! Generic types satisfy this without 
a problem. However, since we don't have generic protocols, we have to 
use a generic abstract base class to represent the same concept — 
an abstract container generic on the type of key which dynamically 
dispatches to the "real" subclassed type


Hopes that gives some simplified insight into the nature of this 
decision.


I see. Protocols with associated types serve the same purpose as 
generic interfaces in other languages, but we don't have the 
first-class support for protocol types with associated type 
constraints (a value of type `Container where Key == K`). That's 
something we'd like to eventually support. In other places in the 
standard library, we write the type-erased container by hand, which is 
standard library, we wrtie the type-erased container by hand, which is 
why we have `AnySequence`, `AnyCollection`, and `AnyHashable`. You 
could probably do something similar here; that would be a bit awkward 
for implementers, but might be easier to migrate forward to where we 
eventually want to be with the language.


-Joe

Yep, that’s a good way to describe it.
We could potentially do that as well, but adding another type like 
`AnyHashable` or `AnyCollection` felt like a much more sweeping change, 
considering that those require some special compiler magic themselves 
(and we’d like to do as little of that as we can).
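For comparison, a hand-written eraser in the `AnySequence` style might look like this (hypothetical `AnyContainer` name, sketched against the simplified `Container` protocol from the earlier message rather than the real API):

```swift
protocol Container {
    associatedtype Key : CodingKey
    func contains(_ key: Key) -> Bool
}

// Type-erased wrapper: captures the conformer's methods as closures,
// so callers only see the key type, not the concrete container.
struct AnyContainer<K : CodingKey> : Container {
    typealias Key = K
    private let _contains: (K) -> Bool

    init<C : Container>(_ base: C) where C.Key == K {
        self._contains = base.contains
    }

    func contains(_ key: K) -> Bool {
        return _contains(key)
    }
}
```

Every method of the protocol has to be forwarded by hand this way, which is the awkwardness for implementers that Joe mentions.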
___
swift-evolution mailing list
swift-evolution@swift.org
https://lists.swift.org/mailman/listinfo/swift-evolution


Re: [swift-evolution] [Proposal] Foundation Swift Encoders

2017-03-15 Thread Itai Ferber via swift-evolution

Hi Will,

Thanks for your comments!
`deferredToDate` simply uses the default implementation that `Date` 
provides — since it is not a primitive type like `Int` or `String` and 
conforms to `Codable` itself, it will have an implementation of 
`init(from:)` and `encode(to:)`. It will have an implementation that 
makes sense for `Date` in general, but since a big use-case for JSON 
lies in talking to external servers which you don't control, allowing 
for custom date formatting is important.


To that end, although ISO 8601 may make sense for some applications as 
the default, it is less efficient to format, encode, decode, and parse 
than, say, writing out a UNIX timestamp as a `Double` (or similar). 
Thus, the default is to allow `Date` to pick a representation that best 
suits itself, and if you need customization, you have the option to use 
it.


Since `Date` makes a concrete decision about how to encode, both sides 
will need to use `deferredToDate` for compatibility, in the same way 
that they would have to agree about ISO 8601, or any of the other 
options.
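Expressed with the strategy API as it eventually shipped in Swift 4 (both sides have to opt into the same strategy):

```swift
import Foundation

let encoder = JSONEncoder()
encoder.dateEncodingStrategy = .secondsSince1970 // a UNIX timestamp Double

let decoder = JSONDecoder()
decoder.dateDecodingStrategy = .secondsSince1970 // must match the writer

let data = try! encoder.encode([Date(timeIntervalSince1970: 1_000)])
let dates = try! decoder.decode([Date].self, from: data)
```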


HTH!

— Itai

P.S. About Xcode autocompletion slowdown, I don't know to be honest, but 
I can't imagine it would be significant. Only certain types have 
`enc...` or `dec...` and even then, the list of methods isn't _that_ 
long.


On 15 Mar 2017, at 18:44, Will Stanton wrote:


Hello,

+1

This proposal seems helpful in standardizing how JSON objects can be 
written, and I commonly encode+decode JSON. The standard library JSON 
and PLIST encoders of Python are a strength, and Swift should be able 
to handle both formats just as easily. Still reading 'Swift Archival & 
Serialization’, but I believe both proposals will improve the safety 
and saneness of serializing/deserialization.


For the JSON coder, how does `deferredToDate` work? Would both the 
writer and reader have to agree to use `deferredToDate`?
Might it be better to force clients to pick a ‘real’ strategy? Why 
not default to one of the formats, perhaps ISO-8601?


(Not too important but also curious how much of a slowdown there will 
be when Xcode/SourceKit tries to autocomplete ‘enc’ or ‘dec’ 
for the Swift Archival & Serialization proposal?)


Regards,
Will Stanton

On Mar 15, 2017, at 6:43 PM, Itai Ferber via swift-evolution 
 wrote:


Hi everyone,
This is a companion proposal to the Foundation Swift Archival & 
Serialization API. This introduces new encoders and decoders to be 
used as part of this system.

The proposal is available online and inlined below.

— Itai

Swift Encoders
• Proposal: SE-
• Author(s): Itai Ferber, Michael LeHew, Tony Parker
• Review Manager: TBD
• Status: Awaiting review
• Associated PRs:
• #8124
Introduction
As part of the proposal for a Swift archival and serialization API 
(SE-), we are also proposing new API for specific new encoders 
and decoders, as well as introducing support for new Codable types in 
NSKeyedArchiver and NSKeyedUnarchiver.


This proposal composes the latter two stages laid out in SE-.

Motivation
With the base API discussed in SE-, we want to provide new 
encoders for consumers of this API, as well as provide a consistent 
story for bridging this new API with our existing NSCoding 
implementations. We would like to offer a base level of support that 
users can depend on, and set a pattern that third parties can follow 
in implementing and extending their own encoders.


Proposed solution
We will:

	• Add two new encoders and decoders to support encoding Swift 
value trees in JSON and property list formats
	• Add support for passing Codable Swift values to NSKeyedArchiver 
and NSKeyedUnarchiver, and add Codable conformance to our Swift value 
types

Detailed design
New Encoders and Decoders

JSON

One of the key motivations for the introduction of this API was to 
allow safer interaction between Swift values and their JSON 
representations. For values which are Codable, users can encode to 
and decode from JSON with JSONEncoder and JSONDecoder:


```swift
open class JSONEncoder {

    // MARK: Top-Level Encoding

    /// Encodes the given top-level value and returns its JSON representation.
    ///
    /// - parameter value: The value to encode.
    /// - returns: A new `Data` value containing the encoded JSON data.
    /// - throws: `CocoaError.coderInvalidValue` if a non-conforming
    ///   floating-point value is encountered during archiving, and the
    ///   encoding strategy is `.throw`.
    /// - throws: An error if any value throws an error during encoding.
    open func encode<Value : Codable>(_ value: Value) throws -> Data

    // MARK: Customization

    /// The formatting of the output JSON data.
    public enum OutputFormatting {
        /// Produce JSON compacted by removing whitespace. This is the
        /// default formatting.
        case compact

        /// Produce human-readable JSON with indented output.
        case prettyPrinted
    }

    /// The strategy to use for encoding `Date` va
```
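A round-trip usage sketch for the API above, as it behaves in the shipped Swift 4 implementation:

```swift
import Foundation

struct Point : Codable {
    var x: Int
    var y: Int
}

let encoder = JSONEncoder()
encoder.outputFormatting = .prettyPrinted // default is .compact

let data = try! encoder.encode(Point(x: 1, y: 2))
let point = try! JSONDecoder().decode(Point.self, from: data)
```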

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-15 Thread Itai Ferber via swift-evolution

Thanks Joe, and thanks for passing this along!

To those who are curious, we use abstract base classes for a cascading 
list of reasons:


* We need to be able to represent keyed encoding and decoding containers 
as abstract types which are generic on a key type
* There are two ways to support abstraction in this way: protocol & type 
constraints, and generic types
  * Since Swift protocols are not generic, we unfortunately cannot 
write `protocol KeyedEncodingContainer<Key : CodingKey> { ... }`, which 
is the "ideal" version of what we're trying to represent

* Let's try this with a protocol first (simplified here):

  ```swift
  protocol Container {
associatedtype Key : CodingKey
  }

  func container<Key : CodingKey, Cont : Container>(_ type: Key.Type) 
-> Cont where Cont.Key == Key {

// return something
  }
  ```

  This looks promising so far — let's try to make it concrete:

  ```swift
  struct ConcreteContainer<K : CodingKey> : Container {
typealias Key = K
  }

  func container<Key : CodingKey, Cont : Container>(_ type: Key.Type) 
-> Cont where Cont.Key == Key {
return ConcreteContainer() // error: Cannot convert return 
expression of type 'ConcreteContainer' to return type 'Cont'

  }
  ```

  Joe or anyone from the Swift team can describe this better, but this 
is my poor-man's explanation of why this happens. Swift's type 
constraints are "directional" in a sense. You can constrain a type going 
_into_ a function, but not _out of_ a function. There is no type I could 
return from inside of `container()` which would satisfy this constraint, 
because the constraint can only be satisfied by turning `Cont` into a 
concrete type from the _outside_.


  Okay, well let's try this:

  ```swift
  func container... {
return ConcreteContainer() as! Cont
  }
  ```

  This compiles fine! Hmm, let's try to use it:

  ```swift
  container(Int.self) // error: Generic parameter 'Cont' could not be 
inferred

  ```

  The type constraint can only be fulfilled from the outside, not the 
inside. The function call itself has no context for the concrete type 
that this would return, so this is a no-go.


* If we can't do it with type constraints in this way, is it possible 
with generic types? Yep! Generic types satisfy this without a problem. 
However, since we don't have generic protocols, we have to use a generic 
abstract base class to represent the same concept — an abstract 
container generic on the type of key which dynamically dispatches to the 
"real" subclassed type


Hopes that gives some simplified insight into the nature of this 
decision.


On 15 Mar 2017, at 18:18, Joe Groff wrote:


Congrats on getting this out! A question from the field:

https://twitter.com/mdiep/status/842178457115230210 Why does the Swift 
Serialization API proposal use abstract base classes?


-Joe


On Mar 15, 2017, at 3:40 PM, Itai Ferber via swift-evolution 
 wrote:


Hi everyone,

The following introduces a new Swift-focused archival and 
serialization API as part of the Foundation framework. We’re 
interested in improving the experience and safety of performing 
archival and serialization, and are happy to receive community 
feedback on this work.
Because of the length of this proposal, the Appendix and Alternatives 
Considered sections have been omitted here, but are available in the 
full proposal on the swift-evolution repo. The full proposal also 
includes an Unabridged API for further consideration.


Without further ado, inlined below.

— Itai

Swift Archival & Serialization
• Proposal: SE-
• Author(s): Itai Ferber, Michael LeHew, Tony Parker
• Review Manager: TBD
• Status: Awaiting review
• Associated PRs:
• #8124
• #8125
Introduction
Foundation's current archival and serialization APIs (NSCoding, 
NSJSONSerialization, NSPropertyListSerialization, etc.), while 
fitting for the dynamism of Objective-C, do not always map optimally 
into Swift. This document lays out the design of an updated API that 
improves the developer experience of performing archival and 
serialization in Swift.


Specifically:

	• It aims to provide a solution for the archival of Swift struct 
and enum types
	• It aims to provide a more type-safe solution for serializing to 
external formats, such as JSON and plist

Motivation
The primary motivation for this proposal is the inclusion of native 
Swift enum and struct types in archival and serialization. Currently, 
developers targeting Swift cannot participate in NSCoding without 
being willing to abandon enum and struct types — NSCoding is an 
@objc protocol, conformance to which excludes non-class types. This 
can be limiting in Swift because small enums and structs can be an 
idiomatic approach to model representation; developers who wish to 
perform archival have to either forgo the Swift niceties that 
constructs like enums provide, or provide an a

Re: [swift-evolution] [Proposal] Foundation Swift Archival & Serialization

2017-03-15 Thread Itai Ferber via swift-evolution

Thanks Zach, that's encouraging to hear!
Apologies for the original message size — to be honest I didn't 
realize how large it was myself until my own email was delayed in 
moderation.


I guess, then, **when responding to the original email, please quote 
only the sections that you are responding to to best ensure that your 
email makes it through.** Sorry for the inconvenience!


On 15 Mar 2017, at 17:45, Zach Waldowski via swift-evolution wrote:


Holy cow. There's much to digest here (so much so that my initial
response, which quoted its content, was denied by the mailing list).
After an initial reading, I don't just want this now, I want it
yesterday. I'm already imaging the encoder/decoders I want to build.
Very exciting.


Best,

  Zachary Waldowski

  z...@waldowski.me






Re: [swift-evolution] Pitch: Replacement for FileHandle

2017-02-15 Thread Itai Ferber via swift-evolution

FYI, Tony is the manager of the Foundation team. :)
We care very much about making sure that the experience of using our 
framework is a positive one — the more Radars we get, the better we 
can prioritize improving APIs that are not working as well as they could 
be for our users. Even if the Radar gets duped to an existing one, thats 
one more +1 for that Radar saying "this is a problem".


I will search for your old Radar, but in any case, please do file a new 
one about this, and about any other issues you have, because we are 
indeed listening.


On 15 Feb 2017, at 10:41, Charles Srstka via swift-evolution wrote:

> On Feb 15, 2017, at 10:52 AM, Tony Parker  
wrote:


Hi Charles,

Have you happened to file a radar for Foundation that I can look up 
(for both this and process)?


We are working hard on making sure that our API is right for Swift, 
and areas like this where we can make fairly trivial improvements are 
things that we can try to prioritize. As you say, the purpose of 
swift-corelibs-foundation is to present a unified API and prevent the 
need to fork. That means the best possible solution is to improve the 
API in Objective-C (where exceptions-as-control flow is wrong too) 
first, and then naturally flow that into Swift.


Thanks,
- Tony


Honestly? I don’t remember. I feel like I would have sometime around 
2005 or 2006 or so, since that’s probably about when this first 
started bothering me, but it’s been a decade, so my memory is hazy. 
I do know that in the meantime, “considered harmful”-type articles 
have been written about these classes, like this one from eight years 
ago:


https://mikeash.com/pyblog/friday-qa-2009-11-13-dangerous-cocoa-calls.html 



If you think it will actually prompt someone to change this, I’d be 
happy to write up a new one, although I figure it would almost 
certainly just get flagged as a duplicate.


Charles



