RSEED and evaluating saved designs


Postby miq » Tue Dec 03, 2024 2:04 am

When creating complex designs, I typically run several instances of Ngene using different values for ;rdraws and ;rep. Despite using the same ;rseed, the efficiency measure changes when I evaluate a previously saved design or use it as a starting design. Is this a bug, or am I overlooking something? I would appreciate any suggestions. Thank you very much in advance.

Here is an example of the syntax I am using:

design

; alts(MNL) = alt1*, alt2*, alt3
; alts(MXL) = alt1*, alt2*, alt3

; rows = 20
; block = 5

; bseed = 179424673
; rseed = 179424673

?; bdraws = sobol(1000)
; rdraws = sobol(300)

; alg = swap(random = 100, swap = 1, swaponimprov = 20, reset = 10, resetinc = 10) ?, stop = noimprov(10000 iterations)

; rep = 300

; eff = 7.31101*MNL(mnl,d) + 1.41841758*MXL(rppanel,d)
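? note: the ;eff line above defines the efficiency measure as a weighted combination of the two models' D-errors
? (the first term is the MNL D-error, the second the panel mixed logit / rppanel D-error)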

; start = species - main 1 - sea birds.ngd
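? note: ;start above loads the previously saved design as a starting design for the algorithm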

; con

; model(MNL):
U(alt1) = b_population.dummy[-0.3270|0.3386|0.2849]*population[0,2,3,1]
+ b_conservation_focus.dummy[-0.1235|-0.0292]*conservation_focus[0,2,1]
+ b_recreation_restrictions.dummy[0.0703|0.0794]*recreation_restrictions[1,2,0]
+ b_cost[-0.002101]*cost[5,10,25,50,100,150,250,500] /
U(alt2) = b_population.dummy*population
+ b_conservation_focus.dummy*conservation_focus
+ b_recreation_restrictions.dummy*recreation_restrictions
+ b_cost*cost /
U(alt3) = b_sq[-0.9669]

; model(MXL):
U(alt1) = b_population.dummy[n,-0.5173,1.0012|n,0.3,0.6073|n,0.4,1.0488]*population[0,2,3,1]
+ b_conservation_focus.dummy[n,-0.1677,0.4624|n,-0.0957,0.4822]*conservation_focus[0,2,1]
+ b_recreation_restrictions.dummy[n,0.0852,0.9283|n,0.0929,1.3373]*recreation_restrictions[1,2,0]
+ b_cost[n,-0.003906,0.006927]*cost[5,10,25,50,100,150,250,500] /
U(alt2) = b_population.dummy*population
+ b_conservation_focus.dummy*conservation_focus
+ b_recreation_restrictions.dummy*recreation_restrictions
+ b_cost*cost /
U(alt3) = b_sq[n,-2.5542,2.4839]

$

Re: RSEED and evaluating saved designs

Postby Michiel Bliemer » Tue Dec 03, 2024 9:09 am

There are potentially three processes that are randomised:
* Draws for random parameters, which can be set via ;rdraws and can be fixed using ;rseed
* Draws for Bayesian priors, which can be set via ;bdraws and can be fixed using ;bseed
* Generation of a random sample of respondents (for panel models only), the size of which is set via ;rep

So it is ;rep that causes the variation in outcomes: in each run, a random sample of virtual respondents is generated, as required for evaluating panel mixed logit models. If you evaluated rp instead of rppanel, you would see that the outcome remains the same, but of course rppanel is the appropriate model and I would not change it to rp. It is currently not possible to fix the random sample generation, but I can see that it would be useful to do so. I will put it on the list for future development; thanks for letting us know.
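To make the mapping concrete, here are the relevant properties from your syntax with annotations added as ? comments (a sketch only; the values are unchanged from your post):

; rseed = 179424673 ? fixes the draws for the random parameters generated via ;rdraws
; rdraws = sobol(300)
; bseed = 179424673 ? would fix the Bayesian prior draws if ;bdraws were activated
?; bdraws = sobol(1000)
; rep = 300 ? size of the simulated sample of respondents used to evaluate MXL(rppanel,d); this sample is regenerated in every run and currently cannot be fixed with a seed

With these settings, re-evaluating the same saved design should reproduce the MNL(mnl,d) value exactly, while the MXL(rppanel,d) value can still differ slightly between runs because of the regenerated sample.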

Michiel

