The code is incorrect; $ERROR should be:

 $ERROR
 LOQ = 0.1
 IPRED = F
 W1 = THETA(8)*IPRED              ; proportional error component (SD scale)
 W2 = THETA(9)                    ; additive error component (SD scale)
 W = SQRT(W1**2 + W2**2)          ; combined SD: squares in both terms
 IRES = DV - IPRED
 IWRES = IRES/W
 DUM = (LOQ - IPRED)/W
 CUMD = PHI(DUM)                  ; M3: probability that DV is below LOQ
 IF(BLQ.EQ.0) THEN
   F_FLAG = 0                     ; observed record: Y is a prediction
   Y = IPRED + W1*ERR(1) + W2*ERR(2)
 ELSE
   F_FLAG = 1                     ; BLQ record: Y is a likelihood
   Y = CUMD
   MDVRES = 1
 ENDIF


Maybe it makes sense to show the entire $PK and $EST blocks: that would make them easier to check. Which ADVAN do you use? If ADVAN6, switch to ADVAN13.
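A sketch of the switch (the TRANS and TOL values here are assumptions; keep whatever your model already uses):

;$SUBROUTINES ADVAN6 TRANS1 TOL=6    ; before
$SUBROUTINES ADVAN13 TRANS1 TOL=9    ; after: LSODA-based solver, more robust for stiff systems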

Use MATRIX=S on the $COV step.
Use NOABORT on the $EST step.

Try NSIG=4 SIGL=12 on $EST with LAPLACIAN.
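Putting these together, a sketch (METHOD, PRINT, and MAX values are assumptions; adapt to your control stream):

$EST METHOD=1 INTERACTION LAPLACIAN NOABORT NSIG=4 SIGL=12 PRINT=5 MAX=9999
$COV MATRIX=S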

For pre-dose samples, do you mean pre-first-dose? Those should be ignored: either remove them from the data set or use EVID=2 MDV=1, and then they will be ignored at estimation. After washout, if all samples are BQLs, I would ignore them in a similar way, at least initially. For the washout setting, all your versions are fine; you can use TIME=0 EVID=4 for the first dose after washout.
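A hypothetical snippet illustrating both points (all values are made up):

ID  TIME  DV   AMT  BLQ  EVID  MDV
1   0     0.1  0    1    2     1    ; pre-first-dose BLQ: ignored at estimation, IPRED still computed
1   0     0    10   0    1     1    ; first dose
1   1     0.5  0    0    0     0    ; observation
1   0     0    20   0    4     1    ; first dose after washout: EVID=4 resets the subject, so TIME=0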

To check the code, I would first remove the BQLs (use MDV=1 EVID=2 for those, to still get IPRED), use FOCEI rather than LAPLACIAN, and make sure that the model fit is good. Then check that IPRED at the points of the BQLs is small. If not, check whether those BQLs are reasonable or could be data errors. Then switch to LAPLACIAN with the initial values of all parameters set to the final values of the previous FOCEI run. Make sure you use the INTERACTION option for all runs; a sketch of the sequence is below.
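A sketch of that two-stage sequence (everything beyond the options already mentioned is an assumption; in practice these are two separate control streams):

; run 1: BQL records set to EVID=2 MDV=1, FOCEI
$EST METHOD=1 INTERACTION NOABORT PRINT=5 MAX=9999

; run 2: BQL records restored; $THETA/$OMEGA/$SIGMA initial estimates
; replaced by the run-1 finals, then
$EST METHOD=1 INTERACTION LAPLACIAN NOABORT NSIG=4 SIGL=12 PRINT=5 MAX=9999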

Good luck!
Leonid




On 5/29/2023 8:02 PM, Hiba Sliem wrote:
Hi

Thank you for your assistance. Unfortunately, I still haven't found a solution 
to my issue, and I keep running into one of these error messages, 
depending on how I code my dataset:

#PROGRAM TERMINATED BY OBJ
  ERROR IN NCONTR WITH INDIVIDUAL       6   ID= 6.00000000000000E+00
  NUMERICAL HESSIAN OF OBJ. FUNC. FOR COMPUTING CONDITIONAL ESTIMATE
  IS NON POSITIVE DEFINITE


#R MATRIX ALGORITHMICALLY SINGULAR
  AND ALGORITHMICALLY NON-POSITIVE-SEMIDEFINITE
0R MATRIX IS OUTPUT
0COVARIANCE STEP ABORTED


I finally managed to get both the estimation and covariance steps to run without 
failing, with the following $ERROR code:
$ERROR
LOQ=0.1
IPRED = F
W1 = THETA(8)*IPRED
W2 = THETA(9)
IRES = DV - IPRED
IWRES = IRES/(W1 + W2)
DUM = (LOQ -IPRED)/(W1 + W2)
CUMD = PHI(DUM)
IF(BLQ.EQ.0) THEN
F_FLAG=0
Y= IPRED +W1*ERR(1) + W2*ERR(2)
ELSE
F_FLAG=1
Y=CUMD
MDVRES = 1
ENDIF

$THETA
  (0.01, 0.38) ; [w1]
  (0.01, 0.1)  ; [w2]

$SIGMA
  1 FIX ; [P] sigma(1,1)
  1 FIX ; [P] sigma(2,2)

However, my results were accompanied by the following message:

MINIMIZATION SUCCESSFUL
  HOWEVER, PROBLEMS OCCURRED WITH THE MINIMIZATION.
  REGARD THE RESULTS OF THE ESTIMATION STEP CAREFULLY, AND ACCEPT THEM ONLY
  AFTER CHECKING THAT THE COVARIANCE STEP PRODUCES REASONABLE OUTPUT.


I think part of the issue is the way I've been formatting my dataset, since I 
get different results depending on how I've set it up, so I'd like your 
opinion on the best way to proceed in the following situations:

> Pre-dose samples that are BLQ or higher, at the time of dosing:
ID  TIME  DV   AMT  BLQ
1   0     0.1  0    1
1   0     0    10   0

Do I keep the original times, or do something like this?

ID  TIME   DV   AMT  BLQ
1   -0.01  0.1  0    1
1   0      0    10   0

And can I ignore these observations altogether (using something like MDV=100 
for example)?

Washout period followed by a predose sample and a new administration of the 
same drug (different dose):

                ID  TIME  DV   AMT  BLQ  EVID
                1   0     0    10   0    1
                1   1     0.5  0    0    0
                1   ...   ...  ...  ...  ...
After washout > 1   1000  0.1  0    1    0
                1   1000  0    20   0    1

Do I leave it unchanged, or use EVID=4, either this way:
ID  TIME  DV   AMT  BLQ  EVID
1   0     0    10   0    1
1   1     0.5  0    0    0
1   ...   ...  ...  ...  ...
1   1000  0.1  0    1    0
1   0     0    20   0    4

Or this way?
ID  TIME  DV   AMT  BLQ  EVID
1   0     0    10   0    1
1   1     0.5  0    0    0
1   ...   ...  ...  ...  ...
1   0     0    20   0    4
1   0     0.1  0    1    0

Sorry if these all seem like obvious questions, but I've been struggling to get 
satisfying results over the last few days and I'd like to understand what I've 
been doing wrong.

Kind regards,

-----Original Message-----
From: Philip Harder Delff <phi...@delff.dk>
Sent: Friday, 26 May 2023 21:42
To: Leonid Gibiansky <lgibian...@quantpharm.com>
Cc: Hiba Sliem <hiba.sl...@pharmalex.com>; nmusers <nmusers@globomaxnm.com>
Subject: Re: [NMusers] Problem with estimating sigma when using M3 method


Hi Hiba,

I agree that the issues are often found in the data rather than the 
model. I recommend checking the data with NMcheckData from the R package called 
NMdata. It scans for a long list of potential issues, some that will make 
Nonmem fail and some that won't. If it finds issues, they are returned in a 
data.frame with references to row numbers and IDs so you can easily identify 
the root cause. If you look at ?NMcheckData you may identify arguments you can 
specify to add to the list of checks the function runs.

Having said this, a data/model issue can also be that your data poorly support 
estimation of parts of your model (practical identifiability).
NMcheckData won't help you identify such issues.

NMdata: https://philipdelff.github.io/NMdata/
NMcheckData manual:
https://philipdelff.github.io/NMdata/reference/NMcheckData.html

An example with a few arguments that activate additional checks:

res.checks <- NMcheckData(mydata, covs="WEIGHTBL", cols.num="WEIGHT", col.usubjid="USUBJID")
Here, NMcheckData will (in addition to a bunch of other checks) verify that WEIGHTBL 
exists, is numeric, non-NA, and unique within subjects; that WEIGHT exists and is 
numeric and non-NA; and that ID is unique against USUBJID and vice versa. (Obviously, 
Nonmem can't read USUBJID if it contains characters, but you could still keep 
it to the right in the dataset for reference.) See the manual above for more 
options.

Best,
Philip

On 2023-05-26 11:06 AM, Leonid Gibiansky wrote:
Yes, SIGMA should be fixed to 1 (do not try anything else; it has to
be done correctly in the code first, and then we should worry about
how to make it work).

For combined error, the expression is
W = SQRT(W1**2 + W2**2) (squares in both terms).

Do not worry about error 134; it is harmless, and you can fix it any
time after you get your model right. Add UNCONDITIONAL MATRIX=S to the
$COV step.

For PARAMETER ESTIMATE IS NEAR ITS BOUNDARY, try adding
NOSIGMABOUNDTEST NOOMEGABOUNDTEST NOTHETABOUNDTEST to the $EST record, as sketched below.
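A sketch with both suggestions in place (the other options shown are assumptions carried over from this thread):

$EST METHOD=1 INTERACTION LAPLACIAN NOABORT PRINT=5 MAX=9999
     NOSIGMABOUNDTEST NOOMEGABOUNDTEST NOTHETABOUNDTEST
$COV UNCONDITIONAL MATRIX=S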

Most of the time, numerical difficulties come from problems with
the data, so it makes sense to clean the data set first as much as
possible.

Leonid


On 5/26/2023 9:30 AM, Hiba Sliem wrote:
Hi

I already tried fixing the value of sigma to 1; the covariance step
isn't implemented when I do that.
If I try fixing it to 0.144, the minimization isn't successful.

I also tried a combined error model like this:
LOQ=0.1
IPRED = F
W1 = THETA(8)*IPRED
W2 = THETA(9)
W = SQRT(W1**1 + W2**2)
DEL = 0
IF(W.EQ.0) DEL = 1
IRES = DV - IPRED
IWRES = IRES/(W + DEL)
DUM = (LOQ -IPRED)/(W + DEL)
CUMD = PHI(DUM)
IF(BLQ.EQ.0) THEN
F_FLAG=0
Y= IPRED +W*ERR(1)
ELSE
F_FLAG=1
Y=CUMD
MDVRES = 1
ENDIF

In which case I get a PARAMETER ESTIMATE IS NEAR ITS BOUNDARY error
message. When trying to fix sigma in the combined model, I get a
MINIMIZATION TERMINATED
   DUE TO ROUNDING ERRORS (ERROR=134) message.

My dataset has a lot of pre-dose samples and washouts between
different periods; is it possible the issue comes from my dataset?

Regards

-----Original Message-----
From: Leonid Gibiansky <lgibian...@quantpharm.com>
Sent: Friday, 26 May 2023 14:51
To: Hiba Sliem <hiba.sl...@pharmalex.com>; nmusers@globomaxnm.com
Subject: Re: [NMusers] Problem with estimating sigma when using M3
method


You should fix

$SIGMA
1 FIX

as you are already estimating the SD using THETA(8).
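To spell out why: with Y = IPRED + THETA(8)*IPRED*ERR(1), the residual variance is
THETA(8)**2 * IPRED**2 * SIGMA(1,1), so THETA(8) and SIGMA(1,1) cannot both be
estimated; fixing SIGMA(1,1) to 1 makes THETA(8) the proportional SD.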

Leonid

On 5/26/2023 4:57 AM, Hiba Sliem wrote:
Hello

I'm fairly new to NONMEM. I'm currently trying to model a phase 1
study with BLQ values; while the run was successful with no error
message, my residual error has an RSE > 70% and a confidence interval that
includes zero.

Here's my code:

$ERROR
LOQ = 0.1
IPRED = F
SD = THETA(8)*IPRED
DEL = 0
IF(SD.EQ.0) DEL = 1
IRES = DV - IPRED
IWRES = IRES/(SD + DEL)
DUM = (LOQ - IPRED)/(SD + DEL)
CUMD = PHI(DUM) + DEL
IF(BLQ.EQ.0) THEN
F_FLAG = 0
Y = IPRED + SD*ERR(1)
ELSE
F_FLAG = 1
Y = CUMD
MDVRES = 1
ENDIF

$EST METHOD=1 INTERACTION LAPLACIAN PRINT=5 MAX=9999 SIG=3 SLOW NUMERICAL MSFO=*.msf

$SIGMA
0.38 ;[P] sigma(1,1) (estimated in a previous model)

Furthermore, when trying to fit this model to my phase 2 dataset,
the covariance step fails when I implement it.

Any suggestions are welcome

Thank you

