Re: [OT] Comodo EV SSL

2017-07-10 Thread Greg Keogh
Ah, you can automate the process by running an "agent" (a service?) on your
server. I see Comodo also offer a free 90 day certificate. This could be a
weekend project to apply to my personal domain.

As a side issue ... last time I tried to get IIS on 2012 to allow both http
and https to the same domain it went haywire in incomprehensible ways and I
reverted to https only. I presume there was some simple trick I missed
despite searching for help. I'll have to battle that problem again as
I'd want both working on my hobby site.

*GK*

On 11 July 2017 at 15:34, Wallace Turner  wrote:

> >It's a shame they only last 90 days
>
> that *is* the feature - you set up your server to auto-renew the cert
> (every 60 days, so if there's a problem you have 30 more to sort)
> so right now on my server I have a scheduled task that checks every day
> for a renewal.
>
> read more here:
> https://letsencrypt.org/2015/11/09/why-90-days.html
> I like point 2)
> >They encourage automation
>
> On Tue, Jul 11, 2017 at 11:23 AM, Greg Keogh  wrote:
>
>>> Is free cheap enough?
>>> https://letsencrypt.org/
>>>
>>> Before you bag it out, read the FAQ:
>>> https://letsencrypt.org/docs/faq/
>>>
>>
>> Quite surprising! It's a shame they only last 90 days.
>>
>> I eventually got the truth out of one of the Comodo sales people that the
>> cheapest EV cert was a "Positive EV SSL" (whatever the hell that is in
>> their overly large product range). Cost $US149/year, which was within my
>> tolerance limits. So it's in and running.
>>
>>  -- *GK*
>>
>
>


Re: [OT] Comodo EV SSL

2017-07-10 Thread Wallace Turner
>It's a shame they only last 90 days

that *is* the feature - you set up your server to auto-renew the cert
(every 60 days, so if there's a problem you have 30 more to sort)
so right now on my server I have a scheduled task that checks every day for
a renewal.

read more here:
https://letsencrypt.org/2015/11/09/why-90-days.html
I like point 2)
>They encourage automation

On Tue, Jul 11, 2017 at 11:23 AM, Greg Keogh  wrote:

>> Is free cheap enough?
>> https://letsencrypt.org/
>>
>> Before you bag it out, read the FAQ:
>> https://letsencrypt.org/docs/faq/
>>
>
> Quite surprising! It's a shame they only last 90 days.
>
> I eventually got the truth out of one of the Comodo sales people that the
> cheapest EV cert was a "Positive EV SSL" (whatever the hell that is in
> their overly large product range). Cost $US149/year, which was within my
> tolerance limits. So it's in and running.
>
>  -- *GK*
>


Re: [OT] Comodo EV SSL

2017-07-10 Thread Greg Keogh
>
> Is free cheap enough?
> https://letsencrypt.org/
>
> Before you bag it out, read the FAQ:
> https://letsencrypt.org/docs/faq/
>

Quite surprising! It's a shame they only last 90 days.

I eventually got the truth out of one of the Comodo sales people that the
cheapest EV cert was a "Positive EV SSL" (whatever the hell that is in
their overly large product range). Cost $US149/year, which was within my
tolerance limits. So it's in and running.

 -- *GK*


Re: AZURE SQL Data Sync

2017-07-10 Thread Scott Barnes
That helped, thanks for that.

I'm in the same boat with the inbound data (writes back to onPrem). For me,
given the sensitivity of the data, I'm probably inclined to run it through a
validation pipeline and feed it through as if it were input via the original
source. It could be wasteful but, in the end, it can be a good point for
telemetry to track the injection process in a fine-grained way.




---
Regards,
Scott Barnes
http://www.riagenic.com

On Tue, Jul 11, 2017 at 12:01 PM, Greg Low (罗格雷格博士) 
wrote:

> Yep, it might well do that. Another option to consider might be SQL Server
> replication to an Azure SQL DB (if that’s not supported yet, it’s about to
> be).
>
>
>
> But the one that I’ve used very successfully when I need a read-only copy
> of some of the on-premises data in the cloud is:
>
>
>
>- Create the Azure SQL DB
>- In the on-premises SQL box, create a linked server to the Azure SQL
>DB
>- Use a SQL Agent job (or similar) to just push the required info up
>into the Azure SQL DB via the linked server.
>
>
>
> We also found a few tricks while doing this. For example, if we had to
> merge the data from on-premises to the Azure SQL DB, instead of doing a
> MERGE command, we just did INSERT commands instead. Then at the cloud end,
> we created an INSTEAD OF trigger to replace the INSERT with a merge. We
> routinely ended up with better performance overall. I’d never do it that
> way in an on-premises box but in this case the latency was the thing I
> needed to avoid, and it’s easier to just push info up and sort it out at
> the other end, rather than trying to work it out from the on-premises end.
>
>
>
> Hopefully we’ll soon get support for External DataSource and External
> Table objects in on-premises SQL. They are already there in Azure. They
> make this experience much better again.
>
>
>
> Regards,
>
>
>
> Greg
>
>
>
> Dr Greg Low
>
>
>
> 1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676 4913 fax
>
> SQL Down Under | Web: www.sqldownunder.com | http://greglow.me
>
>
>
> *From:* ozdotnet-boun...@ozdotnet.com [mailto:ozdotnet-bounces@ozdotnet.com]
> *On Behalf Of* Scott Barnes
> *Sent:* Tuesday, 11 July 2017 9:49 AM
> *To:* ozDotNet 
> *Subject:* Re: AZURE SQL Data Sync
>
>
>
> Greg,
>
>
>
> Awesome response firstly.
>
>
>
> Secondly, the intent of its use is essentially to provide a continuum
> strategy for moving legacy (ASP.NET WebForms) apps towards the cloud. The
> first part of the strategy is to move on-prem hosting into VM instance based
> hosting (cloud). In doing this there are residual on-prem databases that
> cannot be moved into the cloud themselves.
>
> Example.
> OnPrem there are two databases, the first being "Financial" and the second
> being "Employee". There is also a website that reads/writes to both of these
> databases depending on a variety of contexts. However, the Financial
> database is quite large but the website only uses, say, 10% of its total
> structure for a specific set of needs (think of it as being 100 tables but
> only 5 get actually used).
>
>
>
> One can move the Employee/Website from OnPrem into cloud-based hosting and
> therefore remove the residual hosting of these two from OnPrem. One can
> also create a copy of the "Financial" database (initially empty) based on
> the actually used "parts" of said database (5 tables). The website gets its
> connection context updated to point at the VM.
>
>
>
> The intent then is to use Data Sync to essentially push/pull data from
> OnPrem to the cloud as data is populated, either in real time or on a
> scheduled interval (either option).
>
>
>
> The objective for Azure SQL Data Sync in this strategy is to act as a
> transport that keeps OnPrem data up to date in the cloud, so that website(s)
> can look at the Azure SQL instance for reads/writes as if the OnPrem
> database had also been moved.
>
>
>
> Constraints on writes can also easily be avoided by using a background
> agent that manually feeds writes to the database, so one could also just
> assert that the Data Sync takes a volatile database onPrem and just pushes
> "snapshots in time" of said data into the cloud.
>
>
>
> From what I've read this looks like it's in Azure SQL's wheelhouse... but
> given the volatility of Azure's weekly product management it's important
> not to assume :)
>
>
>
>
>
>
> ---
> Regards,
> Scott Barnes
> http://www.riagenic.com
>
>
>
> On Mon, Jul 10, 2017 at 8:34 PM, Greg Low (罗格雷格博士) 
> wrote:
>
> Hi Scott
>
>
>
> Up to a few months back, I would have said “run away fast”. But now not so
> sure.
>
>
>
> This was a “product” that stayed in “preview” mode for so very long. Blog
> posts had long ago stopped and many of us for years had been asking if it
> was another product that was just silently dropped without actually being
> put to death.
>
>
>
> But I met with Lindsey Allen last year and when we were discussing it, she
> said that it was going to GA.

RE: AZURE SQL Data Sync

2017-07-10 Thread 罗格雷格博士
Yep, it might well do that. Another option to consider might be SQL Server
replication to an Azure SQL DB (if that’s not supported yet, it’s about to be).

But the one that I’ve used very successfully when I need a read-only copy of 
some of the on-premises data in the cloud is:


  *   Create the Azure SQL DB
  *   In the on-premises SQL box, create a linked server to the Azure SQL DB
  *   Use a SQL Agent job (or similar) to just push the required info up into
the Azure SQL DB via the linked server (see the sketch below).
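
A minimal T-SQL sketch of that setup, run on the on-premises box. The linked
server, Azure server, database, table, and login names here are all
placeholders, not details from this thread:

EXEC master.dbo.sp_addlinkedserver
    @server = N'AzureCopy',                         -- local name for the linked server
    @srvproduct = N'',
    @provider = N'SQLNCLI11',                       -- SQL Server Native Client
    @datasrc = N'yourserver.database.windows.net',  -- Azure SQL server (placeholder)
    @catalog = N'FinancialCopy';                    -- Azure SQL database (placeholder)

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'AzureCopy',
    @useself = 'FALSE',
    @rmtuser = N'pushlogin',                        -- SQL login in the Azure SQL DB
    @rmtpassword = N'<password>';

-- The body of the SQL Agent job step: push rows not yet in the cloud copy,
-- using four-part naming through the linked server (assumes dbo.Invoices
-- already exists in the Azure SQL DB).
INSERT INTO AzureCopy.FinancialCopy.dbo.Invoices (InvoiceId, CustomerId, Amount)
SELECT i.InvoiceId, i.CustomerId, i.Amount
FROM dbo.Invoices AS i
WHERE i.InvoiceId > (SELECT ISNULL(MAX(InvoiceId), 0)
                     FROM AzureCopy.FinancialCopy.dbo.Invoices);

The high-water-mark WHERE clause is just one way of picking which rows to
push; match it to how the source table actually changes.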

We also found a few tricks while doing this. For example, if we had to merge 
the data from on-premises to the Azure SQL DB, instead of doing a MERGE 
command, we just did INSERT commands instead. Then at the cloud end, we created 
an INSTEAD OF trigger to replace the INSERT with a merge. We routinely ended up 
with better performance overall. I’d never do it that way in an on-premises box 
but in this case the latency was the thing I needed to avoid, and it’s easier 
to just push info up and sort it out at the other end, rather than trying to 
work it out from the on-premises end.
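
A sketch of that cloud-end trigger (placeholder names again). The on-premises
job does plain INSERTs, and the Azure SQL DB end rewrites each one as a merge,
here written as an update-then-insert pair:

CREATE TRIGGER dbo.Invoices_UpsertOnInsert
ON dbo.Invoices
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows that already exist in the cloud copy get updated in place...
    UPDATE tgt
    SET tgt.CustomerId = src.CustomerId,
        tgt.Amount = src.Amount
    FROM dbo.Invoices AS tgt
    INNER JOIN inserted AS src ON src.InvoiceId = tgt.InvoiceId;

    -- ...and rows that don't exist yet are inserted. Inside an INSTEAD OF
    -- trigger, this INSERT does not refire the trigger.
    INSERT INTO dbo.Invoices (InvoiceId, CustomerId, Amount)
    SELECT src.InvoiceId, src.CustomerId, src.Amount
    FROM inserted AS src
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Invoices AS t
                      WHERE t.InvoiceId = src.InvoiceId);
END;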

Hopefully we’ll soon get support for External DataSource and External Table 
objects in on-premises SQL. They are already there in Azure. They make this 
experience much better again.
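
For reference, a minimal sketch of the Azure-side (elastic query) syntax being
referred to; all names and secrets are placeholders:

-- One-off security setup in the database that will read the remote data.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL RemoteCred
WITH IDENTITY = 'remotelogin', SECRET = '<password>';

-- Point at the remote Azure SQL database...
CREATE EXTERNAL DATA SOURCE FinancialSource
WITH (
    TYPE = RDBMS,
    LOCATION = 'yourserver.database.windows.net',
    DATABASE_NAME = 'Financial',
    CREDENTIAL = RemoteCred
);

-- ...and expose one of its tables locally. Queries against dbo.Invoices
-- here are sent through to the remote database.
CREATE EXTERNAL TABLE dbo.Invoices
(
    InvoiceId  int NOT NULL,
    CustomerId int NOT NULL,
    Amount     decimal(18, 2) NOT NULL
)
WITH (DATA_SOURCE = FinancialSource);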

Regards,

Greg

Dr Greg Low

1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676 4913 fax
SQL Down Under | Web: www.sqldownunder.com | http://greglow.me

From: ozdotnet-boun...@ozdotnet.com [mailto:ozdotnet-boun...@ozdotnet.com] On 
Behalf Of Scott Barnes
Sent: Tuesday, 11 July 2017 9:49 AM
To: ozDotNet 
Subject: Re: AZURE SQL Data Sync

Greg,

Awesome response firstly.

Secondly, the intent of its use is essentially to provide a continuum strategy
for moving legacy (ASP.NET WebForms) apps towards the cloud. The first part of
the strategy is to move on-prem hosting into VM instance based hosting (cloud).
In doing this there are residual on-prem databases that cannot be moved into
the cloud themselves.

Example.
OnPrem there are two databases, the first being "Financial" and the second
being "Employee". There is also a website that reads/writes to both of these
databases depending on a variety of contexts. However, the Financial database
is quite large but the website only uses, say, 10% of its total structure for a
specific set of needs (think of it as being 100 tables but only 5 get actually
used).

One can move the Employee/Website from OnPrem into cloud-based hosting and
therefore remove the residual hosting of these two from OnPrem. One can also
create a copy of the "Financial" database (initially empty) based on the
actually used "parts" of said database (5 tables). The website gets its
connection context updated to point at the VM.

The intent then is to use Data Sync to essentially push/pull data from OnPrem
to the cloud as data is populated, either in real time or on a scheduled
interval (either option).

The objective for Azure SQL Data Sync in this strategy is to act as a
transport that keeps OnPrem data up to date in the cloud, so that website(s)
can look at the Azure SQL instance for reads/writes as if the OnPrem database
had also been moved.

Constraints on writes can also easily be avoided by using a background agent 
that manually feeds writes to the database, so one could also just assert that 
the Data Sync takes a volatile database onPrem and just pushes "snapshots in 
time" of said data into the cloud.

From what I've read this looks like it's in Azure SQL's wheelhouse... but given
the volatility of Azure's weekly product management it's important not to
assume :)



---
Regards,
Scott Barnes
http://www.riagenic.com

On Mon, Jul 10, 2017 at 8:34 PM, Greg Low (罗格雷格博士) <g...@greglow.com> wrote:
Hi Scott

Up to a few months back, I would have said “run away fast”. But now not so sure.

This was a “product” that stayed in “preview” mode for so very long. Blog posts 
had long ago stopped and many of us for years had been asking if it was another 
product that was just silently dropped without actually being put to death.

But I met with Lindsey Allen last year and when we were discussing it, she said 
that it was going to GA. I was not expecting that. And sure enough, there has 
been some life back in the blog, etc. lately, and some updates did occur to the 
code. It’s moved across into the new Azure portal.

It’s still quite a distance from being what I’d really consider a strong 
product but I hope it succeeds as there is a real need for it. One of the 
biggest limitations is the number of replicas involved.

What are you looking to use it for? Often there are better alternatives.

Regards,

Greg

Dr Greg Low

1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676 4913 fax
SQL Down Under | Web: www.sqldownunder.com | http://greglow.me


Re: [OT] Comodo EV SSL

2017-07-10 Thread Wallace Turner
Is free cheap enough?
https://letsencrypt.org/

Before you bag it out, read the FAQ:
https://letsencrypt.org/docs/faq/



On Fri, Jun 23, 2017 at 6:22 AM, Greg Keogh  wrote:

> Why are you using an EV cert?
>>
>
> Because it looks pretty and creates a nice impression.
>
> I could downgrade to a cheaper non-EV option, which is my backup plan. I
> see on their website there's a $117.51 EV option which the sales person
> never mentioned. Typical product and price confusion.
>
> *GK*
>


Re: AZURE SQL Data Sync

2017-07-10 Thread Scott Barnes
Greg,

Awesome response firstly.

Secondly, the intent of its use is essentially to provide a continuum
strategy for moving legacy (ASP.NET WebForms) apps towards the cloud. The first
part of the strategy is to move on-prem hosting into VM instance based
hosting (cloud). In doing this there are residual on-prem databases that
cannot be moved into the cloud themselves.

Example.
OnPrem there are two databases, the first being "Financial" and the second
being "Employee". There is also a website that reads/writes to both of these
databases depending on a variety of contexts. However, the Financial
database is quite large but the website only uses, say, 10% of its total
structure for a specific set of needs (think of it as being 100 tables but
only 5 get actually used).

One can move the Employee/Website from OnPrem into cloud-based hosting and
therefore remove the residual hosting of these two from OnPrem. One can
also create a copy of the "Financial" database (initially empty) based on
the actually used "parts" of said database (5 tables). The website gets its
connection context updated to point at the VM.

The intent then is to use Data Sync to essentially push/pull data from
OnPrem to the cloud as data is populated, either in real time or on a
scheduled interval (either option).

The objective for Azure SQL Data Sync in this strategy is to act as a
transport that keeps OnPrem data up to date in the cloud, so that website(s)
can look at the Azure SQL instance for reads/writes as if the OnPrem
database had also been moved.

Constraints on writes can also easily be avoided by using a background
agent that manually feeds writes to the database, so one could also just
assert that the Data Sync takes a volatile database onPrem and just pushes
"snapshots in time" of said data into the cloud.

From what I've read this looks like it's in Azure SQL's wheelhouse... but
given the volatility of Azure's weekly product management it's important not
to assume :)




---
Regards,
Scott Barnes
http://www.riagenic.com

On Mon, Jul 10, 2017 at 8:34 PM, Greg Low (罗格雷格博士)  wrote:

> Hi Scott
>
>
>
> Up to a few months back, I would have said “run away fast”. But now not so
> sure.
>
>
>
> This was a “product” that stayed in “preview” mode for so very long. Blog
> posts had long ago stopped and many of us for years had been asking if it
> was another product that was just silently dropped without actually being
> put to death.
>
>
>
> But I met with Lindsey Allen last year and when we were discussing it, she
> said that it was going to GA. I was not expecting that. And sure enough,
> there has been some life back in the blog, etc. lately, and some updates
> did occur to the code. It’s moved across into the new Azure portal.
>
>
>
> It’s still quite a distance from being what I’d really consider a strong
> product but I hope it succeeds as there is a real need for it. One of the
> biggest limitations is the number of replicas involved.
>
>
>
> What are you looking to use it for? Often there are better alternatives.
>
>
>
> Regards,
>
>
>
> Greg
>
>
>
> Dr Greg Low
>
>
>
> 1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676 4913 fax
>
> SQL Down Under | Web: www.sqldownunder.com | http://greglow.me
>
>
>
> *From:* ozdotnet-boun...@ozdotnet.com [mailto:ozdotnet-bounces@ozdotnet.com]
> *On Behalf Of* Scott Barnes
> *Sent:* Monday, 10 July 2017 2:25 PM
> *To:* ozDotNet 
> *Subject:* AZURE SQL Data Sync
>
>
>
> Anyone have experience using Azure SQL Data Sync?  Any "If they only put
> this on the back of the brochure" moments that left you with buyer's remorse?
>
>
> ---
> Regards,
> Scott Barnes
> http://www.riagenic.com
>


RE: AZURE SQL Data Sync

2017-07-10 Thread 罗格雷格博士
Hi Scott

Up to a few months back, I would have said “run away fast”. But now not so sure.

This was a “product” that stayed in “preview” mode for so very long. Blog posts 
had long ago stopped and many of us for years had been asking if it was another 
product that was just silently dropped without actually being put to death.

But I met with Lindsey Allen last year and when we were discussing it, she said 
that it was going to GA. I was not expecting that. And sure enough, there has 
been some life back in the blog, etc. lately, and some updates did occur to the 
code. It’s moved across into the new Azure portal.

It’s still quite a distance from being what I’d really consider a strong 
product but I hope it succeeds as there is a real need for it. One of the 
biggest limitations is the number of replicas involved.

What are you looking to use it for? Often there are better alternatives.

Regards,

Greg

Dr Greg Low

1300SQLSQL (1300 775 775) office | +61 419 201 410 mobile | +61 3 8676 4913 fax
SQL Down Under | Web: www.sqldownunder.com | http://greglow.me

From: ozdotnet-boun...@ozdotnet.com [mailto:ozdotnet-boun...@ozdotnet.com] On 
Behalf Of Scott Barnes
Sent: Monday, 10 July 2017 2:25 PM
To: ozDotNet 
Subject: AZURE SQL Data Sync

Anyone have experience using Azure SQL Data Sync?  Any "If they only put this
on the back of the brochure" moments that left you with buyer's remorse?


---
Regards,
Scott Barnes
http://www.riagenic.com