Re: Should we get rid of style insensitivity?

2018-11-23 Thread moerm
We can turn it any way we like, but we must not go against the vital rule that a 
compiler must never change an identifier behind the developer's back.

Leading '_' can be discussed (I'm pro, but can accept Nim staying against it).

Case sensitivity and/or rules (like types starting with a capital letter, vars 
and procs not) can be discussed/voted on/dictated by Araq. One might not like 
the outcome, but one can live with it.

"Style insensitivity", however, goes against the vital rule. Simple as that. 
Even a "religious" preference like CamelCase over snake_case is something that 
one may live with, if that preference were important enough to @Araq and the 
core team to ban underscores.

But the "style insensitivity" was a major sin from the start. Period.

If Araq hates snake_case so much he should simply forbid it (i.e. any 
underscores in any identifiers). But allowing them, even dressing it up as a 
"liberal" and cool feature, and then having the compiler change identifiers 
and un-snake_case them is a major sin and frankly not acceptable for a good 
language that is concerned with safety and clarity - even more so when the 
language produces code in other languages.

If I wanted to play lottery with the compiler I could have stayed with C.


Re: How to convert integer to pointer

2018-11-23 Thread mashingan
> I just want to check the type at runtime

[https://nim-lang.org/docs/typeinfo.html](https://nim-lang.org/docs/typeinfo.html)


Re: Should we get rid of style insensitivity?

2018-11-23 Thread mashingan
Just because code has a uniform _naming style_ doesn't mean it's easier to 
read, and vice versa.

From a non-programmer's point of view, glorious_function and gloriusFunction 
are (about) the same, read the same, and (almost) look the same, and that's 
only natural.


Re: How to convert integer to pointer

2018-11-23 Thread kcvinu
> Types exist at compile-time only (opposite to Ruby, Python and the like).

I just want to check the type at runtime and want to make some changes in the 
program. So I am assuming I can't do it in Nim.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread lscrd
Thanks for the clarification. But is it really style insensitivity? That's what 
I thought too, until I understood that it is only a consequence of ignoring 
underscores in identifiers, just as underscores are ignored in numbers.

So Nim's style insensitivity may be seen as only a consequence of Nim's case 
insensitivity and of its special use of underscores to improve readability in 
numbers and identifiers.

Still, one can dislike it, but it's not as arbitrary as it might seem.
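
For what it's worth, a tiny illustration of the rule being described (the names 
are made up; Nim compares identifiers with the first letter case-sensitive and 
the rest case- and underscore-insensitive):

let bigNumber = 1_000_000    # underscores in the literal are ignored
echo big_number              # same identifier as bigNumber to the compiler
echo 1_000_000 == 1000000    # true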


The future of NIM's WASM?

2018-11-23 Thread HerbertClair
I came from Go and recently got interested in WebAssembly, but I do not feel 
that Go is particularly good for it, mainly due to the large binary size. So I 
looked around for a new language, and Nim seems perfect for it: modern, fast, 
and able to produce small binaries even for the web.

I have found some efforts from last year, but have there been recent 
developments or plans for the future?


classic "can not open sdl2" ...

2018-11-23 Thread thedoc
Greetings. I don't know why it is not working.

I have updated to the latest nim. I have made sure that I have the correct dll 
installed, which is the 64bit version. I have placed the dll everywhere where 
it might make sense, including system32 and nimbin and, of course, the local 
directory.

What am I doing wrong?

Thank you.


Re: How to convert integer to pointer

2018-11-23 Thread Stefan_Salewski
mashingan, your example above looks a bit funny and strange to me.

Your printout proc is generic, so due to the actual calls with int, string and 
float parameters we should get 3 distinct instances of that proc in the 
executable, and the actual parameter decides which instance is called. But 
then the if statement doesn't make much sense any more; it should be a when. 
But when is compile time of course. So for me there is no actual runtime test 
here at all.
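
A minimal sketch of the kind of generic proc under discussion (the original 
example is not reproduced here, so the proc name printout and its body are 
assumptions). With when, each instantiation compiles only the branch matching 
its parameter type; with if, every branch would still have to typecheck in 
every instantiation:

proc printout[T](a: T) =
  # `when` is resolved at compile time, separately for each instantiation
  when a is int:
    echo "int: ", a + 1      # only compiled for the int instance
  else:
    echo "other: ", a

printout(42)         # instantiates printout[int]
printout("hello")    # instantiates printout[string]
printout(3.14)       # instantiates printout[float]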


Re: Should we get rid of style insensitivity?

2018-11-23 Thread cblake
I think a vote would be fine. Ballot box stuffing is possible, but probably 
identifiable via sheer numbers since the Nim community is so small unless the 
vote gets HackerNews'd or Slashdotted. I also agree that the people who matter 
most would never participate in such a vote because they've already dismissed 
Nim for being "too creative" on the identifier front.

Wording matters a lot, too, as in any survey. I'd wager that there would be 
almost zero "yes" answers to a "would you stop using Nim if it became fully 
sensitive?" question. Also, a "do you know at least one developer who never 
tried Nim because one of the first things they heard about was its 'weird' 
ident rules?" would probably get an average of >50% yes's. Developers in very 
niche internet languages are not always the most social creatures or my 
estimate would be higher.

Personally, I like identifier _sensitivity_. I think things that look different 
to a person should be treated as different by the system, especially a system 
used by programmers who know one wrong character somewhere breaks a program. 
Ease of use studies about filesystems by non-programmers are at best weakly 
relevant, and anyway there is often a wrinkle of "creation time casing" versus 
"reference time casing". Studies of text also show that using ALL CAPS ALL THE 
TIME makes characters easiest to distinguish, but programmers tend not to want 
to do that.

In physics and math (where the underlying syntax of adjacency implies 
multiplication) single character variable names like F=ma and subscripting 
prevail. The handwritten/specialty script nature of those fields then pushes 
users to use _more fonts_ and italics and bold face and so on, and you often 
see textbook authors declare their style conventions. In general, more terse 
languages without a lot of syntactic noise (like Nim) benefit more from shorter 
identifiers which in turn benefit more from sensitivity. Speaking in voice 
about "cap A" is as easy as "A zero".

Giving more picky people flexibility to define their own conventions seems good 
to me, and to me that means sensitivity not the current rules. Less picky or 
more overall consistency-oriented people will just go with the flow and imitate 
Nim standard library conventions. Many will always go for camelCase since they 
know they'll have a lot of stdlib calls and the stdlib does that and so they'll 
want their own code to match that "larger world convention". C programmers 
imitating C stdlib and Java the Java APIs and C++ or POSIX programmers the 
POSIX apis and so on drive all this enormously. So, the core Nim community can 
probably get most of what they want just by controlling the style used by the 
stdlib.

Also, one has to pick which battles one fights! _Injudicious_ choice of 
characters will _always_ be a problem. In typical fonts for Latin script 
languages the upper- and lowercase versions of a letter "visually collide" for 
about 50% of the 26 letters. For another 50% there is a lot of visual 
ambiguity, like the similarity of O/0, 1/l { big-Oh vs zero and one vs 
lowercase-ell }. Colon and semi-colon, commas and periods are often really hard 
to see much difference in but make the world of difference. To my knowledge, no 
one suggests in any seriousness that Nim should forbid big-Oh or little-ell 
from identifiers or that writing Nim should require a certain font because 
otherwise it's hard to tell what means what. Sometimes you have to just rely on 
users of any language to not choose to write deliberately confusing things or 
use programming-hostile fonts.

However, it is perfectly fine ( _fantastic_ , even!) if the compiler has one or 
several warning systems (that can be easily turned off by a flag) for _ALL 
SORTS_ of confusing scenarios that "encourage" clarity/simplicity. People "in 
the thick of it" can have the warning be active. People more in their own 
closed world or with habits/inclinations/fonts that make certain mistakes less 
likely can turn it off. Problem solved. Besides naming convention issues, 
uneven spacing around operators is a good example. There's a warning for that 
in place now. Another example without a current warning in place would be the 
number of spaces of leading indentation (1 space vs 2 vs 3 vs 4 vs 8), all 
currently treated the same by Nim. A shift from indents of 1 to 8 is probably some kind
of bug or at least a highly erratic formatting style. The compiler warning 
could even take some limit where a change of indent shift by <=N was ok and 
warn only for changes bigger than that.


Araq in IRC: for Python migration TableRef will suit you much better than Table

2018-11-23 Thread Stefan_Salewski
[https://irclogs.nim-lang.org/26-10-2018.html#09:38:42](https://irclogs.nim-lang.org/26-10-2018.html#09:38:42)

Can someone explain? 


Re: Araq in IRC: for Python migration TableRef will suit you much better than Table

2018-11-23 Thread mratsim
In Python everything is a ref object. When porting, you might get the wrong 
result if you use tables with value semantics (Table) instead of TableRef and 
the table gets copied somewhere.
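
A small sketch of the difference being described, using only the stdlib tables 
module: Table has value semantics (assignment copies the data), while TableRef 
behaves like a Python dict (assignment shares the same underlying table):

import tables

var a = {"x": 1}.toTable     # value semantics
var b = a                    # full copy
b["x"] = 2
echo a["x"]                  # 1 -- `a` is unaffected, unlike a Python dict

var c = {"x": 1}.newTable    # reference semantics
var d = c                    # both names share one table
d["x"] = 2
echo c["x"]                  # 2 -- matches what the Python code would do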


Re: classic "can not open sdl2" ...

2018-11-23 Thread thedoc
Ah okay, so apparently sdl.nim is **required** for this to work, and it is not 
actually asking for the dll (like it did in the old version) but for sdl.NIM 
... man ...


Re: classic "can not open sdl2" ...

2018-11-23 Thread thedoc
Okay, so apparently I've mixed up example files ... and it's not the dll that's 
missing, but the package. I figured it out by looking at the tests in 
nim/tests/.../sdl. I saw "import sdl" and went "oh". That doesn't explain why I 
have the dll problem with csfml, but at least this is probably solved.

Sorry for being dumb.


Deprecation of "round" with two arguments

2018-11-23 Thread lscrd
Hello all,

When compiling (with development version of Nim) a module which uses the 
"round" function from the "math" module, more precisely the "round" function 
with two arguments (the second one being the number of positions after decimal 
point), I got a deprecation warning. The recommended way to round to some 
decimal position is now to use "format".

I looked at "math.nim" to better understand the reason for this deprecation, 
and it says that the function is not reliable because there is no way to
represent exactly the rounded value as a float. I was aware of this, of course, 
but I don’t see how using "format" could be better. As I don’t want a string 
but a float, I would have to convert the exact string representation to a 
float, losing precision in the operation.

I have done some comparisons to check if, for some value of "x", I could get a 
difference between _x.round(2)_ , _x.formatFloat(ffDecimal, 2).parseFloat()_ 
and the expected result (as a float). I failed to find one, but, of course, I 
cannot be sure that there will never exist a difference.

So, I would like to know if there is an example or a theoretical proof which 
shows that the way "round" works (multiplying, rounding, dividing) may give 
less precise results than using format, then parsing the string to a float 
(which will need several floating point operations). Because, to round a float 
to two digits after decimal point for instance, it seems rather overkill to 
convert to a string (with rounding) then convert back to a float, when one can 
simply multiply by 100, round to integer then divide by 100.
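
For reference, a rough version of the comparison described above, written 
against the current non-deprecated API (the helper roundMul reproduces the 
multiply/round/divide approach; the loop bound and random range are arbitrary 
choices, and no particular outcome is claimed):

import math, strutils, random

# round to `places` decimals by multiplying, rounding and dividing
proc roundMul(x: float, places: int): float =
  let mult = pow(10.0, places.float)
  round(x * mult) / mult

randomize()
var mismatches = 0
for i in 0 ..< 1_000_000:
  let x = rand(1000.0)
  let viaMath = roundMul(x, 2)
  let viaString = x.formatFloat(ffDecimal, 2).parseFloat()
  if viaMath != viaString:
    inc mismatches
echo "mismatches: ", mismatches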


Re: How to convert integer to pointer

2018-11-23 Thread mratsim
For inherited types (i.e. runtime types) you can use `of` instead of `is`.
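
A minimal sketch of what that looks like (the Shape/Circle/Square names are 
made up for illustration); `of` only works on object types that share an 
inheritable base, and the conversion back to the subtype is checked at run 
time:

type
  Shape = ref object of RootObj
  Circle = ref object of Shape
    radius: float
  Square = ref object of Shape
    side: float

proc describe(s: Shape) =
  # `of` asks about the dynamic (runtime) type behind the Shape reference
  if s of Circle:
    echo "circle, radius ", Circle(s).radius
  elif s of Square:
    echo "square, side ", Square(s).side

describe(Circle(radius: 2.0))
describe(Square(side: 3.0))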


Re: classic "can not open sdl2" ...

2018-11-23 Thread Stefan_Salewski
When sdl.nim is missing (package not installed) you should get an error at 
compile time. When a dll is missing, you should get an error when you launch 
your executable.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread allochi
Here is a little experiment.

  1. Write some sample code, deliberately mixing styles (relying on style 
insensitivity) in the code
  2. Show it to some colleagues who don't know Nim
  3. Try to explain the code to them without bringing style insensitivity up 
unless they ask
  4. They will definitely ask why you have get_data() and getData()
  5. Explain to them how it doesn't matter
  6. Take their opinion on that



You will not have a problem explaining the code to them since Nim is so concise 
and expressive, and I'm sure they will love it.

Now, colleagues read each other's code all the time, so maybe instead of asking 
ourselves if we like style insensitivity, we should ask whether those we work 
with would welcome it into our production pipeline and wouldn't mind adopting 
Nim in a team project.

I'll keep using Nim regardless of the result of this discussion, and I'm 
looking forward to replacing Go completely with Nim (and/or Zig) in the future, 
but one of the many things I appreciate in Go is code consistency, and how I 
can read code I wrote years ago and still understand why I wrote it and what 
each part means.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread andrea
@allochi that experiment does not really make much sense, because you are not 
supposed to use style insensitivity to mix styles inside a codebase, unless you 
are a masochist. Style insensitivity is there so you can take a library written 
in a different style and use it inside your project without having to adapt to 
the library's style.


Re: classic "can not open sdl2" ...

2018-11-23 Thread thedoc
It is installed, but it has to be in the same directory as the rest, else it 
does not appear to be recognized. That makes sense, of course. It's as you 
describe now ... and I probably should have upgraded to 0.19 earlier.


Re: How to convert integer to pointer

2018-11-23 Thread alehander42
Hey @kcvinu, what are you trying to do generally with this approach? It seems 
interesting, but it's very rarely needed to do something like that in normal 
Nim code: I think you're trying to recreate some Python/Ruby code patterns, but 
maybe you can solve this in quite a different way (e.g. overloading; see the 
sketch below). A use case always helps!
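
For example, the overloading route might look like this (a sketch with made-up 
names, not kcvinu's actual code): one proc per concrete type, resolved at 
compile time, with no runtime type checks at all:

# each call picks the matching overload at compile time
proc describe(x: int): string = "an int: " & $x
proc describe(x: string): string = "a string: " & x
proc describe(x: float): string = "a float: " & $x

echo describe(42)
echo describe("hi")
echo describe(3.14)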


Re: How to convert integer to pointer

2018-11-23 Thread kcvinu
Thanks man. Great help. :)


Re: How to convert integer to pointer

2018-11-23 Thread kcvinu
Could you please post a code snippet using "of" instead of "is"?


Re: Should we get rid of style insensitivity?

2018-11-23 Thread moerm
Side note: Adapting to a library's style _can_ also be seen as an advantage 
("obviously code from elsewhere").

In fact, almost all of these questions _can_ be seen one way or the other.

Based on some decades of experience, I can't remember ever having felt that 
some library's naming convention was a problem for me.

If we want freedom - and I guess close to 100% want freedom - then diverse 
styles are a price we shouldn't be too concerned about.

What is beyond personal taste and styles, however, is that whatever (in terms 
of style) we write, a compiler should never ever change it behind our back. The 
compiler may complain, say about a leading underscore if the language doesn't 
allow it, but it must not change anything.

Btw. I already know the underscore as the (typical) "digit triple separator" 
from Ada and love it. But those are (numerical) literals, not identifiers, and 
"changing" (ignoring) those '_' in literals is OK and a different matter.

Btw.2 My own experience with non-/not-yet Nim users is that they find style 
insensitivity exotic, weird, and sometimes funny. I've yet to see a colleague 
_like_ it and consider it advantageous (for the sake of fairness: my group 
probably does not represent the general developer community).


Re: How to convert integer to pointer

2018-11-23 Thread kcvinu
It is just an experiment. I love ArrayList in VB.NET and list in Python, and I 
am trying to implement something like them in Nim. I know that most of the time 
we can define a user type and declare a seq containing that type, and that will 
work. But this is also a nice feature for a language to have.

This is my aim -

  1. Get the pointer of data
  2. Convert it to ByteAddress
  3. Keep it in a seq



Then later loop through the seq, find the ByteAddress by index, and convert it 
back to the actual pointer. But AFAIK I need to use the type name to get the 
pointer back.
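
A rough sketch of that round trip (the names are hypothetical). One caveat: a 
plain ByteAddress stored in a seq is invisible to the GC, so the referenced 
objects must be kept alive by some other reference, and the concrete type name 
is indeed needed for the cast back:

type Person = ref object
  name: string

var keepAlive: seq[Person] = @[]       # keeps the objects reachable for the GC
var addresses: seq[ByteAddress] = @[]

let p = Person(name: "Ida")
keepAlive.add p
addresses.add cast[ByteAddress](p)     # pointer -> ByteAddress, stored in a seq

let q = cast[Person](addresses[0])     # later: back to the actual type by name
echo q.name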


Re: Should we get rid of style insensitivity?

2018-11-23 Thread cblake
Does anyone out there routinely use this feature of diverging from the style of 
an `import`, or do you, as I mentioned, just follow the lib's conventions? Part 
of @dom96's survey should perhaps ask whether that aspect is just a 
"theoretical nicety".

I mean, someone cared enough about project-internal style consistency to write 
NEP1 and someone also cared enough to add that `nim c --nep1` system. I think 
if most/all example code/standard and popular libraries use one style that most 
people will copy it.

The reasons they would _not_ imitate it are more likely to be either A) 
technical like wrapping sensitive libs or B) cultural like some corporate style 
guide legislation (of which idents are just one dimension with spacing, 
comments, function size, etc. being many others) or C) strong personal 
preferences. In those cases, they seem more likely to just reject Nim outright, 
but that's just my hunch. C) could cut either way. So, call it 5 out of 6 cases 
of rejection for 1 case of ecstatic acceptance. A 6x larger community would 
make a world of difference...

So, if the primary motivation is just that Nim core personally does not want to 
deal with other identifier styles, I doubt there is much value realized by 
style insensitivity. In a fully sensitive world they might have to deal with it 
5-10% of the time. Yeah, more than zero.

It's hard to know for sure and definitely late in the game, of course. It's 
certainly unique. Maybe it's the killer feature! If it were _easy_ to predict 
popularity then the world would be a _very_ different place.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread gemath
> you are not supposed to use style insensitivity to mix styles inside a 
> codebase

If that were the case, and I wish it were, style insensitivity would at least 
be limited to the `import`, `importcpp` and `importc` statements, to be 
resolved with symbol binding.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread moerm
> Does anyone out there routinely use this feature of diverging from the style 
> of an import, or do you, as I mentioned, just follow the lib's conventions?

I do. I do not find the Nim lib naming problematic, but I always and 
consistently use my own style in my own code (CamelCase for vars, snake_case 
for procs). I'd probably also use my own style for types, but Nim's rule for 
types (capital first letter and CamelCase) matches mine anyway.

And yes, not being able to keep my own style (which I thought about and 
deliberately chose) would have kept me away from Nim. And I absolutely HATE 
Nim's style insensitivity; so much so, in fact, that I find myself more 
interested in e.g. Zig than I'd like to be.


Re: How to convert integer to pointer

2018-11-23 Thread alehander42
@kcvinu This is a very interesting experiment, and I'll be interested in seeing 
if it's possible for it to work well.

Still, keep in mind that this is just veeery unusual in normal Nim code: I 
doubt I'll ever use anything like that even if it works, because 1) one can use 
normal types / variants in 95% of the cases, 2) it's probably slower, 3) it 
_seems_ very type unsafe. It might make sense for Python/Ruby, but it's not 
needed for Nim code.

But, if you look at it as just an experiment, it's pretty interesting. 


Re: Deprecation of "round" with two arguments

2018-11-23 Thread Araq
We decided that this variant of round is almost never what you should use. The 
stdlib needs to avoid procs that trick you into programming bugs. If you 
**really** need it, use this code:


import math

proc myRound*[T: float32|float64](x: T, places: int): T =
  if places == 0:
    result = round(x)
  else:
    var mult = pow(10.0, places.T)
    result = round(x * mult) / mult

(Apart from the added import, that is the stdlib's implementation.)


Re: How to convert integer to pointer

2018-11-23 Thread yglukhov
Here's something that could help: 
[https://github.com/yglukhov/variant](https://github.com/yglukhov/variant). And 
+1 in most cases you can/should go with static typing.


Re: The future of NIM's WASM?

2018-11-23 Thread Araq
Our plans for the future are more general but improve the support for wasm as a 
side-effect:

  * Make the language more agnostic to the used underlying allocators 
(destructors).
  * Introduce an optional code transformation that maps exception handling to 
`if` statements.
  * Make the tracing garbage collection precise with respect to stack roots. 
This means Nim's GCs will get rid of hardware specific code that doesn't port 
well to wasm.



Whether you compile the Nim code to wasm via C/Emscripten or via Nim's LLVM 
backend (nlvm) does not matter, these toolchains will all benefit from the 
outlined changes. Note that a direct wasm backend is currently not planned, but 
I expect the community to provide one. A Nim-to-wasm translator is a nice 
greenfield project to learn about compilers. 


Re: How to convert integer to pointer

2018-11-23 Thread mashingan
> But then the if statement doesn't make much sense any more; it should be a 
> when. But when is compile time of course. So for me there is no actual 
> runtime test here at all.

I thought `if` is used at run time while `when` is for compile time, isn't it 
like that?


Re: The future of NIM's WASM?

2018-11-23 Thread mashingan
I don't know, but this was already asked several months ago and 90% of the post 
wording is the same, :/ 
[https://forum.nim-lang.org/t/4049](https://forum.nim-lang.org/t/4049)


Re: The future of NIM's WASM?

2018-11-23 Thread Araq
What's new is my answer. :P


Re: Should we get rid of style insensitivity?

2018-11-23 Thread allochi
One person wouldn't use multiple styles; with a big team of developers, it's a 
different story.

Imagine a really big project where you hire consultants to work on part of it, 
and they write their code in a different style than the one the team has been 
using for months, and then you need to debug that code. I have been there 
multiple times with my consultants; mistakes happen out of habit.

@araq and @dom96, maybe a tool would solve the problem, like an option in fmt 
that reformats all identifiers into a certain style, so code can be imported 
and transformed into one single style.

One problem with my suggestion: it's easy to convert between camel case myVar 
and snake case my_var, but not when it's ambiguous, like myvar, since these are 
all the same identifier.

The funny part: I saw the vote on Twitter and by mistake voted "Yay" XD


Re: How to convert integer to pointer

2018-11-23 Thread kcvinu
Can I put data types like HMENU, HWND, HPEN, HBRUSH in a variant?


Re: Should we get rid of style insensitivity?

2018-11-23 Thread andrea
@gemath Actually, a pure Nim library could choose not to respect NEP1 and 
follow a different naming style instead. If I use such a library, I will call 
it using NEP1 identifiers; no importc in sight.

I think it could make sense to add a compiler warning for the case where mixed 
styles are used **inside** the same project (that is, not in code coming from 
Nimble libraries).


Re: How to convert integer to pointer

2018-11-23 Thread yglukhov
You could put those even into an int. But yeah, sure. Though note that Variant 
will not distinguish type aliases.


Re: Deprecation of "round" with two arguments

2018-11-23 Thread lscrd
Yes, I agree that this function is generally not what is needed. Most of the 
time we want a string, and "format" is what should be used. I didn't want to 
discuss the decision; I was just curious to know whether there exist situations 
where it actually gives a wrong result.

Now, in my case, this is not a big deal. I need only rounding to 0, 1 and 2 
decimals at three places, so I changed to use explicit code: `round(x)`, 
`round(10 * x) / 10`, `round(100 * x) / 100`.


Re: How to convert integer to pointer

2018-11-23 Thread Stefan_Salewski
> I thought if is used at run time while when is for compile time, isn't it 
> like that?

Generally it is, of course.

But I just tested what I wrote above, and my memory was indeed correct: "echo 
(a + 1)" does not compile for your example code, but replacing if with when 
makes it compile again. As the proc is generic, the type of parameter a is 
known at compile time for each instance of the proc, so the test "if a is int" 
compiles but doesn't make much sense.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread didlybom
Completely agree. If that is the main benefit, why not focus on providing a 
non-controversial solution to that particular problem?


Re: Should we get rid of style insensitivity?

2018-11-23 Thread metasyn
Although I do not use Nim every day (I'd like to!), I've found that style 
insensitivity is by far the biggest obstacle to Nim adoption at my company.

I think it might be easier to overlook if there were a tool that could format 
the code one way or another, or a linter of sorts that would enforce a 
particular style?


Has anyone written something like 'Expect' in Nim?

2018-11-23 Thread bobg
Has anyone written something like 'Expect' in Nim?

Expect is a tool for wrapping command-line commands with a parser that can 
execute the command and then interpret the results.

Commonly used as one of the basic pieces for constructing command execution 
tools like ansible or fabric.

(see 'Exploring Expect' by Don Libes - very old book)


Memory regions / gc:regions

2018-11-23 Thread carterza
Does ANYONE know how to use memory regions beyond the examples that are 
available on the forum? These examples tend to consist of -


var rx: MemRegion

withRegion rx:
  discard 0



While this example code proves regions work, it doesn't prove much beyond that. 
There is little, if any, mention of them in the docs.

The module itself claims "# Stacks can also be moved from one thread to 
another." but there are no tests I can find involving regions or threads. 
[https://github.com/nim-lang/Nim/blob/devel/lib/system/gc_regions.nim#L33](https://github.com/nim-lang/Nim/blob/devel/lib/system/gc_regions.nim#L33)

Does anyone besides Araq know how to use these regions successfully? If so - 
could some of that knowledge be disseminated so that I, or someone else, could 
work on documentation for this feature?

If no one knows how to use them properly, and we don't plan on documenting them 
any time soon, is there any real reason to have them or include them?

I have a lot of difficulty finding any real use of them on github...


Re: Should we get rid of style insensitivity?

2018-11-23 Thread runvnc
When people say it's an obstacle to adoption, in exactly what way? Do you mean 
that they actually tried to compile something and it failed because of 
identifiers being considered the same in Nim? Or just that they looked at the 
web page, saw that, and that is the thing they decided to give as their reason? 
Because in the first case it seems it would be better to differentiate the 
identifiers, say by making one more descriptive, and in the second case they 
may just not want to learn a new language anyway but didn't want to say that.

However, if it is the first case and they could not change the codebase because 
there were too many other identifiers or something, and if that occurs 
frequently, it could be a case for making it optional. But we would need to 
confirm that they actually tried and had a practical issue that wasn't better 
resolved by just making identifiers more specific.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread runvnc
The reason C sometimes relied upon underscores was that in the good old days 
many systems only supported one case. Now that all modern systems have upper 
and lower case, it makes sense to use an initial uppercase letter to reuse the 
same descriptive name for a type rather than a variable.


Re: Should we get rid of style insensitivity?

2018-11-23 Thread mashingan
> I've found that style insensitivity is by far the biggest obstacle to Nim 
> adoption at my company.

I can definitely say that's an incorrect finding. 99% of companies like to play 
it safe and use whatever is popular without really understanding the pros/cons 
of what they're using. They don't want one super good programmer but a bunch of 
mediocre programmers, so they can replace any programmer whenever they want.

It's not about style/case insensitivity, but rather about how popular it is. So 
it comes back to a **killer app**: if someone produces a killer app using Nim, 
its popularity should skyrocket and, whatever the warts, whether they like it 
or not, they will use it (see JavaScript).


Re: Should we get rid of style insensitivity?

2018-11-23 Thread GULPF
> I think it might be easier to overlook if there were a tool that could format 
> the code one way or another, or a linter of sorts that would enforce a 
> particular style?

`--styleCheck:hint` or `--styleCheck:error` can be passed to the compiler, 
which enforces that all symbols in a project use camel case.
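
A quick sketch of what that looks like in practice (the file name and the exact 
compiler output are assumptions, not a transcript):

# style_demo.nim -- deliberately inconsistent spelling of one symbol
proc get_data(): int = 42
echo getData()    # resolves to get_data under style insensitivity

# Compiling with `nim c --styleCheck:hint style_demo.nim` is expected to emit
# hints about the non-conforming/inconsistent spelling, while
# `--styleCheck:error` turns those hints into compile errors.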