On Wednesday, January 17, 2018 at 2:30:13 PM UTC, Leo wrote:
> Hello everyone, 
> 
> I am implementing a time-dependent recommender system that applies BPR 
> (Bayesian Personalized Ranking), using stochastic gradient ascent to 
> learn the model parameters. One iteration involves randomly sampling a 
> quadruple (i.e. userID, positive_item, negative_item, epoch_index) n 
> times, where n is the total number of positive feedbacks (i.e. the 
> number of ratings given across all items). Since my implementation takes 
> too long to learn the parameters (it requires 100 iterations), I was 
> wondering whether there is a way to improve my code and speed up the 
> learning process.
> 
> Please find the code of my implementation at the link below. The update 
> function, updateFactors, is the one that learns the parameters, and I 
> guess it is the one that should be improved in order to speed up the 
> learning process: 
> https://codereview.stackexchange.com/questions/183707/speeding-up-the-implementation-of-stochastic-gradient-ascent-in-python

You have a fair chunk of code, so I've only taken a quick glance; this 
could just be the tip of the iceberg.

You're clearly using Python 2, as you have print statements rather than 
functions.  If you can, upgrade to Python 3, as it is superior; see e.g. 
https://www.youtube.com/watch?v=f_6vDi7ywuA
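
If a full upgrade isn't feasible right away, a common stopgap (my 
suggestion, not something in your code) is to pull the Python 3 print 
function into each module, which eases the migration later:

    # Must appear at the top of the file, before other code.
    from __future__ import print_function

    print("starting iteration", 42)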

    for i in range(1, len(self.userItems[userID])):

is a code smell; consider iterating directly:

    for userItem in self.userItems[userID]:

(Note that your range starts at 1, so the direct loop also visits the 
first element; if skipping it is deliberate, see the sketch below.)
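
A sketch of that case, assuming the skip of the first element is 
intentional:

    from itertools import islice

    # Visits every element except the first, without manual indexing.
    for userItem in islice(self.userItems[userID], 1, None):
        ...

and if you genuinely need the index as well, enumerate gives you both:

    for i, userItem in enumerate(self.userItems[userID]):
        ...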

    if len(pos_per_user[userID]) == 0:

is better written as

    if not pos_per_user[userID]:

since an empty container is falsey.
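
A quick interactive check of that truthiness rule:

    >>> not []
    True
    >>> not [2, 3, 5]
    False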

    if userID == None:

should be

    if userID is None:

because None is a singleton, so an identity test is the idiomatic check 
(PEP 8 recommends it explicitly).

Profile your code with https://docs.python.org/3/library/profile.html to find 
where the hot spots are.
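
For instance, from the command line (yourscript.py is a placeholder for 
your actual entry point):

    python -m cProfile -s cumulative yourscript.py

or from inside the code, around the suspected hot spot:

    import cProfile
    # The call inside the string is hypothetical; use your own entry point.
    cProfile.run("trainer.updateFactors()", sort="cumulative")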

Read https://wiki.python.org/moin/PythonSpeed/PerformanceTips for tips on 
how to improve your code's performance.
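
Since updateFactors is your hot loop, the biggest single win is usually 
replacing per-element Python arithmetic with NumPy vector operations. 
Below is a minimal sketch of one BPR gradient-ascent step, assuming user 
factors P and item factors Q are NumPy arrays and lr/reg are your learning 
rate and regularisation; all names here are my assumptions rather than 
taken from your code, and I've ignored the epoch_index/time dimension for 
brevity:

    import numpy as np

    def bpr_step(P, Q, u, i, j, lr=0.05, reg=0.01):
        # One stochastic gradient ascent step on ln sigma(x_uij), the
        # BPR objective for the sampled triple (u, i, j).
        pu = P[u].copy()                 # snapshot before updating
        qi, qj = Q[i].copy(), Q[j].copy()
        x_uij = np.dot(pu, qi - qj)      # predicted preference gap
        z = 1.0 / (1.0 + np.exp(x_uij))  # sigma(-x_uij)
        P[u] += lr * (z * (qi - qj) - reg * pu)
        Q[i] += lr * (z * pu - reg * qi)
        Q[j] += lr * (-z * pu - reg * qj)

Moving the inner dot products and updates into NumPy like this often cuts 
the per-sample cost substantially, without restructuring the rest of the 
trainer.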

--
Kindest regards.

Mark Lawrence.