3 Ways to Improve Your Regression with Data Science and Machine Learning, Part 2
(no charge; case study; step-by-step; hands-on option)


Registration: http://hubs.ly/H01Y9BD0

Alternative Link: 
http://info.salford-systems.com/3-ways-to-improve-your-regression-part2


January 27th, 10AM - 11AM PT

*         If the time is inconvenient, please register and we will send you a 
recording

*         Part 1 is not required to understand the approach and concepts in 
tomorrow's webinar, but if you want a refresher, you can watch last week's 
webinar at your convenience.  Link to the Part 1 recording: 
http://hubs.ly/H01VzrN0

ABSTRACT:
Last week, we showed you how you could drastically improve prediction accuracy 
in your linear regression with a new model that handles missing values, 
interactions, AND nonlinearities in your data.  As a follow-up to last week's 
webinar, we will show you how to take data science techniques even further to 
extract actionable insight and take advantage of advanced modeling features. 
You will walk away with several methods to turn your ordinary regression into 
an extraordinary regression!

Techniques used:

*         Stochastic gradient boosting: TreeNet plots show you the impact of 
every variable in your model; take it a step further by creating spline 
approximations of these plots and using them as inputs in a conventional 
linear regression for boosted model performance!

*         Nonlinear regression splines: MARS nonlinear regression will still 
give you what looks like a standard regression equation, but its coefficients 
apply to transformations of your original variables rather than to the raw 
variables themselves.

*         Modeling automation: learn how to cycle through numerous modeling 
scenarios automatically to discover best-fit parameters.

Included with Registration:

*         On-demand recording of webinar

*         Data set used in presentation

*         Step-by-step instructions

*         30-day free access to MARS, TreeNet, and Random Forests

More details:

*         Last week, we showed you how you could drastically improve prediction 
accuracy in your linear regression with a new model that handles missing 
values, interactions, AND nonlinearities in your data.  This week, we will 
rebuild those original models and get straight to the more advanced features.

*         We will quickly review how to incorporate nonlinearities in a 
regression splines model AND THEN show you how to automatically detect 
interactions and include them for an even better result.  (A rough sketch of 
the spline idea appears after this list.)

*         We will quickly review stochastic gradient boosting and how its plots 
show you how each variable contributes to your model.  Then you will see how 
to create approximations from these plots and use them as inputs to a standard 
linear regression.  (See the second sketch after this list.)

*         We will also explore the benefits of model automation. Without any 
custom programming, you can quickly cycle through different modeling scenarios, 
such as intelligently shrinking your predictor pool by removing variables one 
by one, or automatically re-running your regression model using different loss 
functions. This lets you create many different models and choose the best one 
for your analysis needs.  (The third sketch after this list shows, with a plain 
loop, the kind of work the automation saves you from scripting.)



These techniques are great for skeptics who like to stick with standard 
regression but wish to see dramatic improvements. With very large datasets, you 
will see a significant speed benefit as well.  Learn what is being used at some 
of the largest banks and credit companies in the world.



And if you want a refresher, you can see last week's webinar at your 
convenience: http://hubs.ly/H01VzrN0


Who should attend:

*         Attend if you want to implement data science techniques, even without 
a data science, programming, or statistics background.

*         Attend if you want to understand why data science techniques are so 
important for analysts.


Registration: http://hubs.ly/H01Y9BD0

Alternative Link: 
http://info.salford-systems.com/3-ways-to-improve-your-regression-part2
