
Question on adding a constraint

PostPosted: Wed Apr 22, 2020 1:40 pm
by xinyi
To provide background on the attributes/levels:
6 attributes, all dummy coded.

1. Overall survival (OS): [2] 22 months, [1] 16 months, [0] 12 months
2. Time to progression (TP): [2] 20 months, [1] 14 months, [0] 10 months
3. Pain: [2] 100% decrease in pain, [1] 50% decrease in pain, [0] 25% decrease in pain
4. Fatigue: [2] increase from mild to severe, [1] increase from mild to moderate, [0] remains at mild
5. Nausea: [3] severe, [2] moderate, [1] mild, [0] none
6. Administration (Admin): [2] oral, [1] subcutaneous, [0] IV

I want to create a D-efficient design. I have no priors based on the literature, so I'm using directional priors (set close to zero, providing only information on the sign). Assumptions: the first three attributes have a positive impact on utility, whereas fatigue and nausea have a negative impact.

Also, I want to remove dominant alternatives and add a constraint such that, within each alternative, OS must be greater than TP, i.e. we cannot have an OS of 12 months with a time to progression of 20 months.

My questions are:
1) How do I add the constraint to prevent an alternative from having OS < TP? I tried using the reject statement (see code below), but I don't think I did it correctly.
2) If I instead specify overall survival and time to progression as continuous variables but eventually analyze them as dummy variables, is that okay? Or do I have to analyze them as continuous variables if I design them as such?
3) Are 0.002 and 0.001 too small? Or should I use 0.02, 0.01, etc. as directional priors?


Design
;alts = alt1*, alt2*
;rows=52
;block=4
;eff = (mnl,d)
;alg = mfederov
;reject:
alt1.OS<alt1.TP and alt2.OS<alt2.TP
;model:
U(alt1) = b1.dummy [0.002|0.001]*OS [2,1,0] +
b2.dummy [0.002|0.001]*TP [2,1,0] +
b3.dummy [0.002|0.001]*PAIN [2,1,0] +
b4.dummy [-0.002|-0.001]*FATI [2,1,0] +
b5.dummy [-0.003|-0.002|-0.001]*NAU [3,2,1,0] +
b6.dummy [0|0]*ADMIN [2,1,0] /

U(alt2) = b1.dummy*OS +
b2.dummy*TP +
b3.dummy*PAIN +
b4.dummy*FATI +
b5.dummy*NAU +
b6.dummy*ADMIN
$

Re: Question on adding a constraint

PostPosted: Wed Apr 22, 2020 2:16 pm
by Michiel Bliemer
See syntax below. I would recommend keeping all variables dummy coded, which still allows you to assume a continuous variable and estimate a linear effect. The other way around cannot be guaranteed; in particular, the modified Fedorov algorithm does not maintain attribute level balance, and some levels may not appear in the survey at all if you do not use dummy coding. Using 0.001 is small enough for dummy-coded variables (which are 0 or 1).

Note that I have used actual months as levels for OS and TP, otherwise the constraint does not make sense.

Code:
Design
;alts = alt1*, alt2*
;rows = 52
;block = 4
;eff = (mnl,d)
;alg = mfederov(candidates = 5000)
;reject:
alt1.OS < alt1.TP,
alt2.OS < alt2.TP
;model:
U(alt1) = b1.dummy[0.002|0.001] * OS[22,16,12]            ? overall survival (months)
        + b2.dummy[0.002|0.001] * TP[20,14,10]            ? time to progression (months)
        + b3.dummy[0.002|0.001] * PAIN[100,50,25]         ? decrease in pain (%)
        + b4.dummy[-0.002|-0.001] * FATI[2,1,0]           ? fatigue
        + b5.dummy[-0.003|-0.002|-0.001] * NAU[3,2,1,0]   ? nausea
        + b6.dummy[0|0] * ADMIN[2,1,0]                    ? administration, 0 = IV, 1 = subcutaneous, 2 = oral
        /
U(alt2) = b1 * OS
        + b2 * TP
        + b3 * PAIN
        + b4 * FATI
        + b5 * NAU
        + b6 * ADMIN
$
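Outside Ngene, the effect of the two reject conditions can be sketched as a filter over the candidate level combinations for a single alternative (a hypothetical Python illustration, not Ngene output):

```python
from itertools import product

# Attribute levels shown to respondents, as in the design above
OS_LEVELS = [22, 16, 12]   # overall survival (months)
TP_LEVELS = [20, 14, 10]   # time to progression (months)

# Keep only combinations where survival is at least as long as time to
# progression, mirroring the ";reject: alt.OS < alt.TP" conditions
feasible = [(os, tp) for os, tp in product(OS_LEVELS, TP_LEVELS) if os >= tp]

print(len(feasible))  # 6 of the 9 combinations survive the constraint
```

Note that this is why the levels must be specified in actual months: with the coded levels [2,1,0] for both attributes, the comparison OS < TP would not express the intended restriction.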


Michiel

Re: Question on adding a constraint

PostPosted: Fri Apr 24, 2020 4:32 am
by xinyi
Hi Michiel,

Thank you for the feedback, this is really helpful! I do have a quick follow-up question (which may be a bit basic, but I wanted to be sure). In this code, does that mean that an OS of 22 will contribute a 0.044 increase in utility in alt1? Similarly, that would mean that a 100% reduction in pain contributes a 0.2 increase in utility. I'm just wondering if the latter is too high considering the relatively small impact of the other attributes on utility (much closer to 0). Or is this really not a big deal, since 0.2 is still considered small anyway?

Thanks!
Xinyi

Re: Question on adding a constraint

PostPosted: Fri Apr 24, 2020 9:38 am
by Michiel Bliemer
You need to distinguish between the attribute levels shown to respondents (which for OS are 12, 16, or 22 months) and how the attribute is coded in your model (0 or 1 for dummy coding). If OS = 12, then the contribution to utility is 0 (because the last level is the base level). If OS = 16, the contribution to utility is 0.001, and if OS = 22, the contribution to utility is 0.002.
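As an illustration of the point (a minimal Python sketch, not Ngene), the prior attaches to the dummy for each level, not to the level value itself:

```python
# Priors attach to dummy-coded levels; the rightmost level is the base (0).
# Level values (months, %) are hypothetical labels taken from the design above.
OS_PRIORS = {22: 0.002, 16: 0.001, 12: 0.0}
PAIN_PRIORS = {100: 0.002, 50: 0.001, 25: 0.0}

def utility_contribution(priors, level):
    # The shown level (e.g. 22 months) is only a label: the contribution
    # is the prior on its dummy, not prior * level.
    return priors[level]

print(utility_contribution(OS_PRIORS, 22))  # 0.002, not 22 * 0.002 = 0.044
```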

Michiel

Re: Question on adding a constraint

PostPosted: Fri Apr 24, 2020 12:44 pm
by xinyi
Thanks, okay, I think I get it now. The level on the furthest right is always the reference level, and unlike a continuous attribute, you don't actually multiply the value of the level by the prior.
Thanks for the clarification!

Re: Question on adding a constraint

PostPosted: Fri Apr 24, 2020 1:20 pm
by Michiel Bliemer
Exactly!

Re: Question on adding a constraint

PostPosted: Sat Apr 25, 2020 2:09 am
by xinyi
Hi Michiel,

Apologies, but I have one more question regarding this design. I realized that with the imposed constraint on the levels for overall survival (OS: 12, 16, 22) and time to progression (TP: 10, 14, 20), TP = 20 can only appear with OS = 22, and OS = 12 can only appear with TP = 10. This results in the attribute levels being quite imbalanced, which I know is expected because of the constraint. But I'm wondering how that will impact the subsequent model estimation. I'm assuming the SEs for OS = 12 and TP = 20 will be a lot larger, given that they are shown comparatively less frequently than the other levels. How important is attribute level balance in this context?

Thanks!

Re: Question on adding a constraint

PostPosted: Sun Apr 26, 2020 11:20 am
by Michiel Bliemer
Think of it as follows. If a specific level does not appear at all in the design, then the associated dummy-coded coefficient could not be estimated and the D-error would be infinite. If a specific level appears only once, then the associated dummy-coded coefficient will have a large standard error and the D-error will be relatively large. When you generate an efficient design, by minimising the D-error you are essentially minimising the standard errors, and as such all levels will appear such that the maximum amount of information is captured. Without constraints, this usually leads to a high degree of attribute level balance across dummy-coded variables. With constraints there will be less balance, but minimising the D-error still means that all levels are represented such that the standard errors of all dummy coefficients are as small as possible. If you have very strict constraints, this may indeed mean that some standard errors are large, and then the only solution is to relax the constraints if possible. If that is not possible, then the efficient design is the best you can do.
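The link between the design and the standard errors can be sketched numerically: for an MNL model, the D-error is det(AVC)^(1/K), where the AVC matrix is the inverse of the Fisher information accumulated over the choice sets. A hypothetical NumPy sketch (not Ngene's implementation; the toy design below is illustrative):

```python
import numpy as np

def mnl_d_error(X, beta):
    """D-error of a design under an MNL model with prior parameters beta.

    X: array of shape (S, J, K) - S choice sets, J alternatives,
       K dummy-coded attribute columns.
    """
    S, J, K = X.shape
    info = np.zeros((K, K))
    for s in range(S):
        v = X[s] @ beta                   # systematic utilities
        p = np.exp(v) / np.exp(v).sum()   # MNL choice probabilities
        z = X[s] - p @ X[s]               # centre attributes at their prob-weighted mean
        info += (z * p[:, None]).T @ z    # Fisher information contribution of set s
    # AVC = inverse information; D-error = det(AVC)^(1/K)
    return np.linalg.det(np.linalg.inv(info)) ** (1.0 / K)

# Toy design: two dummy-coded attributes, three choice sets, near-zero priors
X = np.array([[[1, 0], [0, 1]],
              [[0, 1], [1, 0]],
              [[1, 1], [0, 0]]], dtype=float)
print(mnl_d_error(X, np.array([0.002, 0.001])))  # ~1.414
```

A level that appears in only a few rows contributes little to the information matrix, which inflates the corresponding diagonal of the AVC matrix (its squared standard error) and hence the D-error; a level that never appears makes the information matrix singular, which is the infinite-D-error case described above.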

Michiel

Re: Question on adding a constraint

PostPosted: Mon Apr 27, 2020 9:02 am
by xinyi
Thanks for the explanation! I guess at the end of the day, many factors play into the design, and we have to balance what we want to assess (and the addition of constraints for logical consistency) against how precise we want the estimation to be.