INTRODUCTION

Impairments in different forms of decision making have been observed in a number of neurological disorders in which perturbations in dopamine (DA) activity are thought to contribute to the underlying pathophysiology. These include schizophrenia (Shurman et al, 2005; Heerey et al, 2007), stimulant addiction (Rogers et al, 1999; Hoffman et al, 2006), and depression (Murphy et al, 2001). Accordingly, there has been growing interest in the neurobiological basis of different forms of decision making in experimental animals, with particular emphasis placed on the role of DA in mediating these types of executive functions. In these studies, animals evaluate the cost of competing response options relative to the potential reward that may be obtained. For example, ‘delay-discounting’ tasks are used as a measure of impulsive decision making, where response costs are varied by imposing a delay before delivery of a larger reward vs acquiring an immediate, smaller reward. Alternatively, response costs can be varied by increasing the effort required to receive a larger reward. In ‘effort-based’ decision making, animals choose between a small reward obtainable after a nominal amount of physical effort and a larger reward obtainable only after considerably more work (eg climbing a scalable barrier, pressing a lever multiple times).

Both of these forms of decision making are exquisitely sensitive to manipulations of DA transmission. Administration of DA receptor antagonists induces impulsive choice in rats, reducing the preference for larger, delayed rewards (Cardinal et al, 2000; Wade et al, 2000; Denk et al, 2005; Van Gaalen et al, 2006). Conversely, increasing DA transmission with psychostimulants (eg amphetamine, methylphenidate, nomifensine) has the opposite effect, making animals more tolerant of delays imposed before delivery of larger rewards (Evenden and Ryan, 1996; Cardinal et al, 2000; Wade et al, 2000; Van Gaalen et al, 2006). In a similar vein, DA receptor blockade decreases the tendency of rats to work harder to obtain a larger reward in a number of different choice paradigms. These include studies using a T-maze task where rats must climb a barrier in one arm to obtain a larger reward (Salamone et al, 1994; Denk et al, 2005), or a concurrent choice procedure conducted in an operant chamber in which a rat either responds on a lever at a high ratio to receive preferred reward pellets or merely consumes a less preferred lab chow that is freely available (Cousins et al, 1994). Considerably less is known about how increasing DA transmission may affect this form of decision making. In one study, amphetamine (1–3 mg/kg) decreased both lever pressing and chow consumption using a concurrent lever pressing/feeding choice procedure (Cousins et al, 1994), although these doses were substantially higher than those typically used to modulate delay-based decision making (Wade et al, 2000; Van Gaalen et al, 2006). In contrast, DA transporter knockdown mice that have chronically elevated levels of extracellular DA displayed an increased preference for a more preferred food using a similar procedure (Cagniard et al, 2006). 
Thus, lower doses of psychostimulants increase the preference for larger, delayed rewards, whereas the effect of increasing DA transmission on effort-based decision making is unclear.

Although both delay- and effort-based decision making are altered by systemic DA manipulations, the specific terminal regions where these agents act to affect these two types of processes may differ. Both types of choice are altered in a similar manner by lesions of the basolateral amygdala (Winstanley et al, 2004; Floresco and Ghods-Sharifi, 2007). However, DA depletion in either the nucleus accumbens or anterior cingulate region of the medial prefrontal cortex (PFC) reduces the preference to exert more effort to obtain a larger reward (Cousins et al, 1996; Sokolowski and Salamone, 1998; Schweimer et al, 2005). In contrast, neither DA lesions of the accumbens, nor excitotoxic lesions of medial PFC regions alter delay discounting (Cardinal et al, 2001; Winstanley et al, 2005; Rudebeck et al, 2006). Rather, DA activity in the orbital PFC may be of primary importance in mediating choices between immediate and delayed rewards (Kheramin et al, 2004; Winstanley et al, 2004, 2006). Thus, comparisons between these two forms of decision making must take into account the overlap and segregation of the neural circuits that mediate both types of decisions.

Another consideration when comparing these two types of decision-making processes is that animals typically incur a delay to reinforcement when they are required to exert more effort to obtain a larger reward. Therefore, it is difficult to ascertain whether the role of DA in effort-based decision making is specifically related to evaluations comparing physical response costs relative to reward magnitude, or merely to a modulation of the impact that delays to reinforcement exert over choice behavior. To clarify these issues, we developed a novel ‘effort-discounting’ procedure conducted in an operant chamber, permitting us to parse out the effects of dopaminergic manipulations on effort- and delay-based decision making. The effects of the DA antagonist flupenthixol and the DA releaser D-amphetamine on effort and delay discounting were assessed. More importantly, we also devised a separate effort-discounting procedure in which the delay to reinforcement was equalized between the more and less effortful response options. We also assessed the effects of the noncompetitive NMDA receptor antagonist ketamine, which exerts direct and indirect effects on mesolimbic DA transmission (Verma and Moghaddam, 1996; Kapur and Seeman, 2002) and induces psychotic symptoms and cognitive impairments resembling those observed in schizophrenia (Javitt, 2007). Given that schizophrenia is also associated with impaired decision making (Shurman et al, 2005; Heerey et al, 2007), it is somewhat surprising that the effects of ketamine on these types of executive functions have not been explored. Therefore, it was of particular interest to compare the effects of this compound with those that act directly on DA transmission.

MATERIALS AND METHODS

Animals

Three groups of eight male Long-Evans rats (hereafter referred to as groups A, B, and C; Charles River Laboratories, Montreal, Canada) weighing 275–300 g at the start of behavioral training were used. Rats were individually housed in a colony room on a 12 : 12 h light-dark schedule, and given ad libitum access to water. Feeding occurred in their home cages at the end of the experimental day. Unless otherwise specified, rats were maintained at 85% of their free-feeding weight for the duration of testing. All testing was conducted in accordance with the guidelines of the Canadian Council on Animal Care and with the approval of the Animal Care Committee of the University of British Columbia.

Apparatus

Eight operant chambers (30.5 × 24 × 21 cm; Med-Associates, St Albans, VT, USA) enclosed in sound-attenuating boxes were used. Boxes were equipped with a fan to provide ventilation and to mask extraneous noise. Each chamber was fitted with two retractable levers, one located on each side of a central food receptacle where food reinforcement (45 mg; Bioserv, Frenchtown, NJ) was delivered by a pellet dispenser. Each chamber was illuminated by a single 100 mA houselight located in the top center of the wall opposite the levers. Four infrared photobeams were mounted on the sides of each chamber 3 cm above the grid floor, and another photobeam was located in the food receptacle. Locomotor activity was indexed by the number of photobeam breaks that occurred during a session. It has been our experience that with this positioning of the photobeams, beam breaks are not a reliable index of sniffing or other stereotypies that may be induced by stimulant drugs, but do reliably reflect ambulatory locomotion through the chambers. All experimental data were recorded by an IBM personal computer connected to the chambers via an interface.

Lever Pressing Training

Our initial training protocols were adapted from Cardinal et al (2000). On the day before initial exposure to the operant chamber, rats were given 20 reward pellets in their home cage. Before the animal was placed in the chamber on the first day of training, 2–3 crushed pellets were placed in the food cup and on the active lever. Rats were trained under a fixed-ratio 1 schedule to a criterion of 50 presses in 30 min, first for one lever, then the other (counterbalanced left/right between subjects). On subsequent days, they were trained on a simplified version of the full task. These sessions consisted of 90 training trials and began with the levers retracted and the chamber in darkness. Every 40 s, a trial began with illumination of the houselight and insertion of one of the two levers into the chamber. If the rat failed to respond on the lever within 10 s, the lever was retracted, the chamber darkened and the trial was scored as an omission. If the rat responded within 10 s, the lever retracted, a single pellet was delivered immediately and the houselight remained illuminated for another 4 s. In every pair of trials, the left or right lever was presented once, and the order within the pair of trials was random. Rats were trained to a criterion of 80 or more successful trials (ie 10 or fewer omissions), which took 5–6 days of training.

Decision-making Tasks

Rats in all three groups were trained initially on the effort-discounting task described below, receiving 5–6 daily training sessions per week. After initial drug challenges, rats in each group were trained in different behavioral protocols. The specific pharmacological and behavioral manipulations that rats in each group were subjected to are summarized in Table 1 and Figure 1c.

Table 1 Experiments Performed
Figure 1

Schematic of the decision-making tasks used and task acquisition data. (a) The format of a single free-choice trial on the effort-discounting task. (b) Cost/benefit contingencies associated with responding on either the low-reward (LR) or high-reward (HR) lever on the effort discounting (left), delay discounting (middle), or effort discounting with equivalent delays (right) tasks. (c) Summary of the decision-making tasks and manipulations animals in each group were subjected to over the course of the experiment. (d) Mean (+SEM) proportion of choices of the HR lever across the four trial blocks on the effort-discounting task for all animals at different time points in training. (e) Mean (+SEM) latencies to complete 2, 5, 10, and 20 presses on the HR lever for all animals after 21–23 days of training on the effort-discounting task. (f) Mean (+SEM) proportion of choices of the HR lever for all animals trained on the effort discounting (squares) and subsequently on the effort discounting with equivalent delays (circles) tasks. Equalizing the delay to food delivery between response options increased the proportion of choices of the HR lever across all trial blocks. Star denotes significant difference between training conditions at P<0.05.

Effort discounting

The basic procedure used in this task is diagramed in Figure 1a and b (left). Animals received one daily 32 min session that consisted of 48 discrete choice trials, separated into four blocks. Each block of trials comprised two forced-choice trials on which only one lever was presented (one trial for each lever, in randomized order) followed by 10 free-choice trials. A session began in darkness with the levers retracted (the intertrial state). Trials began at 40 s intervals with the illumination of the houselight, followed by extension of one or both levers 3 s later. One lever was designated as the high-reward (HR) lever, the other the low-reward (LR) lever (counterbalanced left/right between animals), which remained constant for the duration of the experiment. If a rat did not respond within 25 s of lever presentation (omission), the chamber was reset to the intertrial state. Responding on the LR lever caused both levers to be retracted and the immediate delivery of two pellets. However, the first response on the HR lever caused the immediate retraction of the LR lever, while the HR lever remained inserted in the chamber. To increase the effort requirement, rats had to complete a fixed ratio of presses on the HR lever to receive delivery of four reward pellets. Immediately after the last required lever press on the HR lever was made, the lever retracted and four pellets were delivered. Pellets were delivered 0.5 s apart. After food delivery, the houselights remained on for another 4 s, after which the chamber returned to the intertrial state.
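The trial contingencies above can be summarized in a short sketch. This is an illustrative reconstruction, not the original task-control code; the function and constant names are our own, and only the ratios, reward sizes, and response window are taken from the text.

```python
# Illustrative sketch of the free-choice contingencies in the
# effort-discounting task (not the original operant-chamber code).
EFFORT_RATIOS = [2, 5, 10, 20]   # presses on the HR lever, blocks 1-4
HR_PELLETS = 4                   # reward for completing the ratio
LR_PELLETS = 2                   # reward for a single LR press
RESPONSE_WINDOW = 25             # seconds before an omission/incomplete trial

def resolve_choice(block, lever, presses_made, elapsed_s):
    """Return pellets earned on a free-choice trial.

    block        -- trial block index (0-3)
    lever        -- 'LR' or 'HR'
    presses_made -- presses completed on the chosen lever
    elapsed_s    -- seconds since lever presentation
    """
    if elapsed_s > RESPONSE_WINDOW:
        return 0                          # omission or incomplete ratio
    if lever == 'LR':
        return LR_PELLETS                 # two pellets, delivered immediately
    if presses_made >= EFFORT_RATIOS[block]:
        return HR_PELLETS                 # ratio completed: four pellets
    return 0                              # ratio not yet completed
```

Note that on this task the delay to the larger reward is not programmed explicitly; it emerges from the time the rat takes to complete the ratio, which is what the "equivalent delays" variant later controls for.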

The fixed ratio of lever presses required to obtain the HR was varied systematically across each block, starting at 2, then 5, 10, and 20 presses, respectively. If the rat chose the HR lever but did not complete the ratio within 25 s, the lever retracted, no food was delivered and the chamber reverted to the intertrial state, although the animal's choice was still incorporated into the data analysis. These types of incomplete trials were very rare, even on drug test days that increased the overall omission rate. On any given drug test day, rats averaged less than 1 such incomplete trial over the 40 free-choice trials, typically during the last trial block. The amount of time required by each rat to complete the fixed ratio of presses on the HR lever over each trial block was recorded.

As noted above, rats in all three groups were initially trained on the effort-discounting task. For this and all subsequent tasks, training continued until rats as a group (1) chose the HR lever during the first trial block (fixed-ratio 2) on at least 70% of successful trials, and (2) demonstrated stable baseline levels of choice. Stable baseline performance was assessed using a procedure similar to that described by Winstanley et al (2004). Data from three consecutive sessions were analyzed with a repeated-measures ANOVA with two within-subjects factors (Training Day and Trial Block). If the effect of Trial Block was significant at the P<0.05 level but there was no main effect of Training Day or Day × Trial Block interaction, animals were judged to have achieved stable baseline levels of performance. On the following day, the first drug tests were carried out. Daily training sessions continued on subsequent days until rats again demonstrated stable levels of choice, after which a second test day was carried out. This was repeated until rats in each group had received each of their designated drug treatments (see below).
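The stability criterion can be expressed computationally. The sketch below is our own minimal reconstruction, assuming conventional univariate repeated-measures F tests over a subjects × days × blocks array; the function names are ours.

```python
import numpy as np
from scipy.stats import f as f_dist

def rm_anova_2way(data):
    """Two-way repeated-measures ANOVA.

    data: array of shape (subjects, days, blocks), one score per cell.
    Returns p-values for the Day effect, Block effect, and interaction.
    """
    n, a, b = data.shape
    gm = data.mean()
    m_s = data.mean(axis=(1, 2))          # subject means
    m_a = data.mean(axis=(0, 2))          # day means
    m_b = data.mean(axis=(0, 1))          # block means
    m_as = data.mean(axis=2)              # (subject, day) cell means
    m_bs = data.mean(axis=1)              # (subject, block) cell means
    m_ab = data.mean(axis=0)              # (day, block) cell means

    # Sums of squares for effects and their subject-interaction error terms
    ss_a = n * b * ((m_a - gm) ** 2).sum()
    ss_b = n * a * ((m_b - gm) ** 2).sum()
    ss_ab = n * ((m_ab - m_a[:, None] - m_b[None, :] + gm) ** 2).sum()
    ss_as = b * ((m_as - m_a[None, :] - m_s[:, None] + gm) ** 2).sum()
    ss_bs = a * ((m_bs - m_b[None, :] - m_s[:, None] + gm) ** 2).sum()
    ss_s = a * b * ((m_s - gm) ** 2).sum()
    ss_abs = ((data - gm) ** 2).sum() - ss_s - ss_a - ss_b - ss_ab - ss_as - ss_bs

    def p(ss_eff, df_eff, ss_err, df_err):
        F = (ss_eff / df_eff) / (ss_err / df_err)
        return f_dist.sf(F, df_eff, df_err)

    return (p(ss_a, a - 1, ss_as, (a - 1) * (n - 1)),
            p(ss_b, b - 1, ss_bs, (b - 1) * (n - 1)),
            p(ss_ab, (a - 1) * (b - 1), ss_abs, (a - 1) * (b - 1) * (n - 1)))

def is_stable(data, alpha=0.05):
    """Stable baseline: significant Block effect, no Day or Day x Block effect."""
    p_day, p_block, p_int = rm_anova_2way(data)
    return p_block < alpha and p_day >= alpha and p_int >= alpha
```

With eight rats, 3 training days, and four trial blocks, the Block test has (3, 21) degrees of freedom, consistent with the F(3, 21) values reported throughout the Results.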

Effort discounting with descending effort ratios

After initial training and drug tests using the effort- and delay-discounting tasks, rats in group A were then trained on a variant of the effort-discounting procedure. This task was similar to the effort-discounting task, except that the number of presses required to obtain the HR was 20 in the first block and decreased over subsequent blocks.

Delay discounting

After initial training and drug tests using the effort-discounting tasks were completed, rats in all groups were then trained on a delay-discounting task (Cardinal et al, 2000; Winstanley et al, 2004). In this task, a single response on the HR lever delivered four pellets; however, delivery of these four pellets occurred after a delay ranging from 0.5 to 8 s (Figure 1b, middle). A single response on the LR lever delivered two pellets immediately. The delay to food delivery reflected the average time rats in a respective group required to press the HR lever 2, 5, 10, and 20 times during the last 3 days of training on the effort-discounting task (excluding drug test days). For example, on average, rats in group C required 0.4, 1.7, 2.8, and 6.5 s to press the HR lever 2, 5, 10, and 20 times, respectively. These delays increased across the four trial blocks. Rats were trained on this task until they demonstrated stable levels of choice, after which they received their first drug test day. Training continued until stable baseline levels of choice were reestablished, after which a second test day was administered and this procedure was repeated until rats in each group had received all of their designated drug treatments.
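The yoking of delays to effort latencies amounts to averaging logged ratio-completion times over the last 3 drug-free training days and over rats, yielding one delay per trial block. A minimal sketch, with made-up latencies chosen to resemble the group C values quoted above:

```python
import numpy as np

# Hypothetical completion-time log for one group: shape (days, rats, blocks),
# ie mean seconds to finish 2, 5, 10, and 20 presses on the HR lever over the
# last 3 drug-free training days. Values are illustrative, not real data.
latencies = np.array([
    [[0.4, 1.6, 2.7, 6.3]] * 8,   # day 1, 8 rats
    [[0.4, 1.7, 2.8, 6.5]] * 8,   # day 2
    [[0.4, 1.8, 2.9, 6.7]] * 8,   # day 3
])

def yoked_delays(latencies):
    """Average over days and rats to get one programmed delay per trial block."""
    return latencies.mean(axis=(0, 1)).round(1)
```

The same averaging would set the LR-lever delays in the "effort discounting with equivalent delays" variant described below.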

Effort discounting with equivalent delays

After training and drug tests using the delay-discounting procedure had been carried out, rats in groups B and C were then retrained on a modified version of the effort-discounting task. This task was identical to the effort-discounting procedure, except that after a single press on the LR lever, delivery of the two pellets was preceded by a delay (Figure 1b, right). Specifically, after responding on the LR lever, two pellets were delivered after a delay equivalent to that required for rats to complete the ratio of presses on the HR lever using the standard effort-discounting procedure (0.5–8 s). Thus, for each block of trials, the delay to food delivery after responding on either lever was equalized. As with the delay-discounting task, the delay to receive two pellets after a single press on the LR lever increased across trial blocks and was calculated based on the average time it took all rats in the respective groups to press the lever 2, 5, 10, and 20 times during the last 3 days of training on the effort-discounting task. Thus, if rats required 6.5 s to press the HR lever 20 times during the last trial block, a single press on the LR lever during this block would deliver two pellets after a 6.5 s delay. Drug tests and retraining were conducted in an identical manner to that described above.

Pharmacological and Satiety Manipulations

Each drug test consisted of a 2-day sequence, where rats received an intraperitoneal injection of vehicle on the first day and a drug injection on the subsequent day. All drugs were mixed in saline and injected at a volume of 1 ml/kg. Drug tests were separated by at least 4 days. We tested the effects of three drugs using the effort-discounting procedure, with one drug being assigned to a particular group of rats. Rats in each group received three doses of one of the three drugs prior to a training session using the standard effort-discounting procedure. Specifically, rats in group A received the DA antagonist flupenthixol (0.125, 0.25, and 0.5 mg/kg, Sigma-Aldrich, Oakville, ON, Canada), group B received D-amphetamine (0.125, 0.25, and 0.5 mg/kg, Sigma-Aldrich), and group C received the noncompetitive NMDA receptor antagonist ketamine (1, 5, 10 mg/kg, Bimeda-MTC, Cambridge, ON, Canada). The order in which the drug doses were administered is summarized in Table 1 and Figure 1c. For D-amphetamine and ketamine, we were uncertain what doses would be required to exert an effect on choice, making randomization of drug dosing problematic. Thus, our initial challenges used the highest doses listed here, and we used progressively lower doses until no effect was observed. After initial drug tests using the effort-discounting procedure were completed, we tested the effect of these compounds on the different variants of the task described above. Here, we only used the lowest drug doses that caused a significant alteration in choice behavior on the standard effort-discounting procedure.

At the completion of all drug testing, we assessed the effects of acute and long-term free feeding on the choice behavior of rats in group C. Rats in this group were retrained on the standard effort-discounting task until they displayed stable baseline levels of performance. They were then given ad libitum access to standard lab chow in their home cages for the duration of testing, which lasted 6 days.

Data Analysis

Preliminary analyses revealed that there were no significant changes in baseline levels of choice behavior across saline test days within any experimental series (ie effort discounting, delay discounting, etc; all F's <1.65, NS), indicating that on each of these tasks, performance remained stable over several weeks. Thus, for each experiment that incorporated multiple saline and drug tests, the data from all saline test days were averaged and used in the analysis. Our primary dependent measure of interest was the proportion of choices of the HR lever, accounting for trial omissions. For each block, this was calculated by dividing the number of choices of the HR lever by the total number of successful trials. For each series of drug tests, these data were analyzed with separate two-way, repeated-measures ANOVAs with Drug Dose and Trial Block as two within-subjects factors. The latency to complete the fixed ratio of presses on the HR lever over trial blocks was analyzed in a similar manner. The number of trial omissions and locomotor activity data (ie photobeam breaks) were analyzed with one-way repeated-measures ANOVAs. Missing values were replaced with the group mean. Multiple comparisons were conducted using Dunnett's tests.
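The primary dependent measure can be computed per trial block as follows; this is a minimal reconstruction of the calculation described above, with hypothetical function and argument names:

```python
def hr_choice_proportion(hr_choices, lr_choices):
    """Proportion of HR-lever choices among successful free-choice trials.

    Omitted trials are excluded: the denominator counts only trials on
    which the rat actually responded (HR choices + LR choices).
    """
    successful = hr_choices + lr_choices
    if successful == 0:
        return float('nan')          # whole block omitted; treat as missing
    return hr_choices / successful

# Example: 7 HR and 2 LR choices with 1 omission in a 10-trial block
# gives 7/9 rather than 7/10.
```

Dividing by successful trials rather than total trials is what makes the measure robust to the omission increases seen on some drug test days.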

RESULTS

Acquisition of Sensitivity to Effort Requirements

Initial training

Rats in all three groups demonstrated sensitivity to increasing effort requirements and stable baseline levels of choice behavior after an average of 24±4 days of training on the effort-discounting task (group A=18, B=32, C=21 days). Changes in the preference of the HR lever that occurred over training for all 24 rats are shown in Figure 1d. During the first 3 days of training, rats did not show a preference for either lever across all four trial blocks. However, by training sessions 10–12, a slight preference for the HR lever emerged during the first trial block, when the effort requirement was low (two presses). Moreover, rats began to display a discounting curve over the course of the session, choosing the HR lever less as the effort requirement increased. By sessions 21–23, rats were choosing the HR lever on approximately 75% of trials during the first block, and displayed a steeper discounting curve over the rest of the session. Figure 1e displays the average time required by rats to complete the fixed ratio of presses on the HR lever as the effort requirement increased across blocks. After 21–23 days of training, rats were responding at a rate of approximately two presses per second. As expected, rats required more time to complete the fixed ratio of presses as the effort requirement increased. Thus, particularly in the latter blocks of trials, where the effort requirement was 10–20 presses, rats incurred a considerable delay between the point at which they initially selected the HR lever and the time that the four pellets were delivered. As training continued, the latency to complete the ratio of presses in these latter blocks decreased slightly, presumably because rats had acquired more experience with responding at high ratios.

Effect of delays to reinforcer delivery on effort discounting

Rats in groups B and C were trained on both the effort-discounting task and the effort discounting with equivalent delays procedure, where a single press on the LR lever delivered two pellets after a delay equivalent to the time to complete the fixed ratio of presses on the HR lever. We were particularly interested in assessing how equalizing the relative delay to reward delivery affected choice behavior. Therefore, we compared choice data taken from the last 3 days of training on the effort-discounting procedure to those from days 9–11 of training on the effort discounting with equivalent delays procedure. As can be observed in Figure 1f, when delays to receiving either the LR or the HR were equalized, rats chose the HR lever significantly more across all trial blocks, when compared to their preference for the HR lever on the standard effort-discounting task (F(1, 15)=37.24, P<0.001). Thus, delays to reinforcement incurred when animals must respond on a lever multiple times influence the preference for larger rewards that come with a greater response cost. This suggests that manipulations affecting choice behavior on both variants of the effort-discounting task likely act on processes related to effort-based, rather than delay-based decision making. In contrast, alterations in effort and delay discounting combined with a lack of an effect on the effort discounting with equivalent delays task would be indicative of a selective effect on processes related to delay-based decision making.

DA Receptor Blockade with Flupenthixol

Effort discounting

After 18 days of training on the effort-discounting tasks, rats in group A displayed stable baseline levels of choice for 3 consecutive days. At this point in training, rats were responding reliably, and did not make any trial omissions. On separate days, rats in this group received three doses of flupenthixol in the following order: 0.25, 0.50, and 0.125 mg/kg. Analysis of the choice data revealed a significant main effect of Dose (F(3, 21)=3.46, P<0.05), a significant main effect of Trial Block (F(3, 21)=13.26, P<0.001), but no significant Dose × Block interaction (F(9, 63)=1.43, NS). As can be observed in Figure 2a, both the 0.25 and 0.50 mg/kg doses, but not the 0.125 mg/kg dose of flupenthixol caused a significant (P<0.05) reduction in the number of choices directed toward the HR lever on free-choice trials, when compared to saline test days. This effect was most prominent during the third trial block, when the effort requirement on the HR lever was 10 presses. However, analysis of the latency to complete the fixed ratio of presses on the HR lever yielded no significant main effect of Dose (F(3, 21)=2.17, NS), or Dose × Block interaction (F(9, 63)=1.25, NS). This indicates that the effect of flupenthixol on decision making cannot be attributed to decreases in the rate of responding on the HR lever. However, it is possible that flupenthixol may have altered the pattern of responding without altering the overall response rates, as has been reported following DA depletion in the nucleus accumbens (Mingote et al, 2005). Nevertheless, although DA receptor blockade caused rats to choose the HR lever less often, when they did choose the HR lever, their rates of responding were similar to those observed following saline injections. 
The highest dose of flupenthixol did cause a slight increase in trial omissions (5.0±3), primarily attributable to two rats, but there were no statistically significant differences between treatment conditions on this measure (F(3, 21)=1.74, NS). All three doses of flupenthixol decreased locomotor counts (F(3, 21)=4.83, P<0.05). Yet, the fact that the 0.125 mg/kg dose reduced locomotor activity but did not affect choice indicates that the effects of DA receptor blockade on decision making are independent of its effects on ambulatory locomotion.

Figure 2

The effects of DA receptor blockade with flupenthixol on effort- and delay-based decision making. Symbols represent mean+SEM. Stars denote significant (P<0.05) differences vs saline (sal) across all trial blocks. (a) For rats in group A, flupenthixol dose dependently decreased the proportion of choices of the high-reward (HR) lever across all trial blocks using the effort-discounting procedure. (b) The 0.25 mg/kg dose also decreased the proportion of choices of the HR lever when the effort requirement on the HR lever decreased over the course of the session. (c) For rats in group C, flupenthixol reduced the preference for larger, delayed rewards on the delay-discounting task. Numbers on the abscissa represent the delay to delivery of the HR for each trial block, and were calculated from the average time rats took to press the HR lever 2, 5, 10, and 20 times using the effort-discounting procedure. (d) DA receptor blockade reduced the preference for the HR lever on the effort discounting with equivalent delays task. Here a single press on the low-reward (LR) lever delivered two pellets after a delay equivalent to the time required to complete the ratio of presses on the HR lever. Numbers on the abscissa denote the effort requirement on the HR lever (top) and the delay to delivery of the two pellets after a single press on the LR lever (bottom). (e) Flupenthixol increased the number of trial omissions on the effort discounting with equivalent delays task, with this effect being more prominent in the latter parts of the session. (f) Replication of the effects of a 0.25 mg/kg challenge on effort discounting in a separate group of rats that had received extended training (group C).

After receiving the three doses of flupenthixol, rats in this group were then trained on the delay-discounting task (see below) and then retrained for 10 days on a variant of the effort-discounting task, where the effort requirement on the HR lever decreased across the four trial blocks. Under these conditions, the 0.25 mg/kg dose of flupenthixol again decreased the preference for the HR (F(1, 7)=9.46, P<0.05; Figure 2b). Thus, as has been reported using other types of effort-based decision-making tasks, blockade of DA receptors increases effort discounting, reducing the preference to exert greater effort to obtain a larger reward (Salamone et al, 1991, 1994; Denk et al, 2005). These effects cannot be attributed to a psychomotor slowing effect and are independent of whether the effort requirement increases or decreases over time.

Delay discounting

After rats in group A had received their last flupenthixol drug test using the standard effort-discounting procedure, they were trained on the delay-discounting task for 12 days. Here, a single press on the HR lever delivered four pellets after a delay equivalent to the average latency to complete the fixed ratio of presses during the last 3 drug-free days of effort-discounting training (0.4, 1.7, 3.8, and 8.7 s, respectively). By the end of this training period, the group displayed stable levels of choice behavior. When challenged with the 0.5 mg/kg dose of flupenthixol, rats in this group displayed an increase in impulsive responding, showing a reduced tendency to choose the HR lever. However, this dose also caused a substantial increase in trial omissions in five of the eight animals tested (28±6), even though this dose did not have the same magnitude of effect on omissions during an effort-discounting session when administered earlier in training. Each of these five rats made 10 omissions over at least one trial block (ie they did not respond on either lever), making analysis of the choice data problematic.

We subsequently tested the effect of the lower (0.25 mg/kg) dose of flupenthixol on delay discounting on a separate group of rats (group C; Table 1). This dose also caused a significant increase in trial omissions (12.0±4) relative to saline (1.7±1; F(1, 7)=7.14, P<0.05), but this effect was not as severe as that observed with the 0.5 mg/kg dose, permitting an analysis of the choice data. This analysis revealed significant main effects of Dose (F(1, 7)=11.66, P<0.05) and Trial Block (F(3, 21)=6.71, P<0.005), but the Dose × Trial Block interaction only approached significance (F(3, 21)=2.60, P=0.079). As can be observed in Figure 2c, this dose induced impulsive choice, reducing the preference for a larger reward delivered after a delay. This effect was most prominent during the last trial block when the delay to reinforcement delivery was 6.5 s. This dose also decreased locomotor counts (F(1, 7)=6.81, P<0.05).

Effort discounting with equivalent delays

The above-mentioned findings indicate that DA receptor antagonism reduces the preference to work harder or wait longer to obtain a larger reward, as has been reported previously (Cousins et al, 1994; Wade et al, 2000; Denk et al, 2005; Van Gaalen et al, 2006). It is important to note that in the effort-discounting task, animals incurred a delay to delivery of four pellets from the time rats initially chose the HR lever to when they completed the fixed ratio of presses. It was therefore important to determine whether the effects of DA receptor blockade on effort discounting were attributable specifically to alterations in decision-making processes related to the relative effort requirements associated with a choice or to a reduced tolerance for delays to reinforcement. Thus, after receiving their last drug test using the delay-discounting procedure, rats in group C were trained on the effort discounting with equivalent delays task. Here one press on the LR lever delivered two pellets after a delay that was equivalent to the average amount of time rats took to complete the fixed ratio of presses on the HR lever. Rats required 11 days of training on this task before displaying stable baseline levels of performance and receiving their first drug tests. As was observed using the standard effort-discounting procedure, flupenthixol (0.25 mg/kg) caused a significant decrease in the proportion of choices of the HR lever across all trial blocks when the delay to reinforcer delivery was equalized between response options (F(1, 7)=12.88, P<0.01; Figure 2d). However, this effect was not accompanied by reduced locomotor activity or increased latencies to complete the fixed ratio of presses on the HR lever (all F's<1.3, NS), although this dose did increase trial omissions (F(1, 7)=6.31, P<0.05). Interestingly, the majority of these omissions occurred during the last trial block, when the effort requirement on the HR lever was highest (Figure 2e). 
Taken together, these findings indicate that the effects of DA receptor blockade on effort discounting are independent of the effects of these drugs on delay discounting, because in this experiment, animals incurred a comparable delay before receiving either the larger or smaller reward. Thus, DA mediates both effort- and delay-based decisions, but may do so via dissociable mechanisms.
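The logic of this control condition can be sketched with a toy hyperbolic-discounting calculation (purely illustrative, not part of the original study; the discount rate k and the example delays are hypothetical values, not estimates from the data):

```python
# Illustrative sketch: why equating delays isolates effort costs.
# Under hyperbolic discounting, V = A / (1 + k*D), a common model of
# delay discounting (k and the delays below are hypothetical).

def discounted_value(amount, delay, k=0.2):
    """Subjective value of `amount` pellets delivered after `delay` seconds."""
    return amount / (1.0 + k * delay)

# HR lever = 4 pellets, LR lever = 2 pellets. In the equivalent-delays
# task both rewards arrive after the same delay (the average time taken
# to complete the fixed ratio on the HR lever), so delay discounting
# scales both options equally.
for delay in (0.3, 1.2, 3.2, 7.9):
    v_hr = discounted_value(4, delay)
    v_lr = discounted_value(2, delay)
    # The HR:LR value ratio remains 2:1 at every delay; a drug-induced
    # shift away from the HR lever here cannot reflect delay costs alone.
    assert abs(v_hr / v_lr - 2.0) < 1e-9
```

Because a shared delay rescales both options identically under this model, any residual shift in preference under DA blockade is attributable to the effort requirement rather than to delay sensitivity, which is the inference drawn above.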

After the completion of drug tests using the effort discounting with equivalent delays procedure, rats in group C received an additional 6 days of training using the standard effort-discounting task, where one press on the LR lever delivered two pellets immediately. They then received another flupenthixol drug test (0.25 mg/kg). On this drug challenge, two rats made >40 trial omissions, so we analyzed data only from the remaining six rats (6.6±2 omissions). By this point, rats in this group had received nearly 100 days of training on three different variants of the task. Despite this extended training, administration of flupenthixol again reduced the number of choices of the HR lever, replicating the effect observed in group A (F(1, 5)=26.82, P<0.005; Figure 2f). Although there was no significant Dose × Block interaction (F(3, 15)=1.12, NS), the reduction in the preference for the HR lever only became apparent during the second and third trial blocks, indicating that flupenthixol did not decrease the sensitivity to the ratio of available rewards, as has been reported with other DA antagonists (Denk et al, 2005). Thus, the effects of DA receptor antagonism on effort discounting remain stable over time, even after extended training and experience with different types of discounting tasks.

Increasing DA Transmission with D-Amphetamine

Effort discounting

Rats in group B were trained on the effort-discounting procedure for 31 days before receiving three doses of D-amphetamine on separate test days in the following order: 0.50, 0.25, and 0.125 mg/kg. Rats were responding reliably at this point in training, and did not make any trial omissions. Analysis of the choice data on drug test days revealed a significant main effect of Dose (F(3, 21)=5.13, P<0.01), a significant main effect of Trial Block (F(3, 21)=23.491, P<0.001), and, notably, a significant Dose × Block interaction (F(9, 63)=2.29, P<0.05; Figure 3a). This interaction was attributable to the fact that D-amphetamine exerted dose-dependent, biphasic effects on choice behavior. Specifically, injection of the 0.50 mg/kg dose of amphetamine did not alter choice during the first trial block when the effort requirement was low, but significantly (P<0.05) reduced the preference for the HR lever in all subsequent trial blocks, as the effort requirement increased. In contrast, administration of either 0.125 or 0.25 mg/kg of amphetamine actually increased the proportion of choices of the HR lever in the second and third trial blocks, where the effort requirement was 5 and 10 presses, respectively. These differential effects of amphetamine on decision making were not accompanied by changes in the rate of responding on the HR lever. Analysis of the latency to complete the ratio of presses on the HR lever yielded no significant main effect of Dose or Dose × Block interaction (both F's<1.47, NS). Similarly, none of the three doses increased trial omissions (F(3, 21)=1.3, NS). Predictably, locomotor counts were increased by the 0.25 mg/kg (1620±203) and 0.5 mg/kg (1661±170) doses of amphetamine relative to saline (1028±79; F(3, 21)=11.03, P<0.001 and Dunnett's, P<0.05). It is notable that both the 0.25 and 0.50 mg/kg doses of amphetamine increased locomotor activity to a similar degree; however, these doses exerted opposite effects on decision making.
This suggests that the effects of amphetamine on effort-based decision making are independent of its psychomotor stimulant actions. Thus, a 0.50 mg/kg dose of amphetamine increases effort discounting, reducing the preference to exert a greater amount of effort to receive a larger reward. In contrast, lower doses exert the opposite effect, increasing the tendency for rats to work harder for a larger reward.

Figure 3

The effects of amphetamine on effort- and delay-based decision making. Symbols represent mean+SEM. Asterisks denote significant (P<0.05) differences vs saline (sal) at a specific block (Treatment × Block interaction). All other conventions are the same as Figure 2. (a) A higher dose of amphetamine (0.50 mg/kg) decreased the proportion of choices of the high-reward (HR) lever on the effort-discounting task, whereas lower doses (0.25 and 0.125 mg/kg) had the opposite effect. (b) The 0.25 mg/kg dose shifted choice toward the HR lever on the delay-discounting task, but the 0.50 mg/kg dose had no effect. (c) Using the effort discounting with equivalent delays procedure, the 0.50 mg/kg dose again reduced the preference for the HR lever. However, under these conditions, the 0.25 mg/kg dose failed to affect choice.

Delay discounting

After rats in group B had received their last amphetamine drug test, they were trained on the delay-discounting task for 5 days. The average latency to complete the fixed ratio of presses during the last 3 drug-free days of effort-discounting training was 0.3, 1.2, 3.2, and 7.9 s for the four trial blocks, respectively. These latencies were then used as the delays to delivery of four pellets after a single press on the HR lever. By the end of this training period, the group displayed stable levels of choice behavior. We assessed the effects of both the 0.50 and 0.25 mg/kg doses of amphetamine, as each of these doses induced statistically significant, albeit opposing, changes in choice using the effort-discounting task. These data are presented in Figure 3b. The analysis revealed a significant main effect of Dose (F(2, 14)=5.91, P<0.05), and also a significant main effect of Trial Block (F(3, 21)=6.50, P<0.005), indicating that as the delay to food delivery increased, rats chose the HR lever less often. There was no significant Dose × Block interaction (F(6, 42)=0.82, NS). Multiple comparisons indicated that the 0.25 mg/kg dose significantly (P<0.05) increased the tendency for rats to choose the larger, delayed reward, relative to saline treatments. However, the 0.5 mg/kg dose did not alter decision making on this task. There were no trial omissions on either saline or amphetamine test days. Both doses of amphetamine significantly increased locomotor activity (F(2, 14)=15.63, P<0.001). Thus, as has been reported previously (Cardinal et al, 2000; Van Gaalen et al, 2006), lower doses of amphetamine reduce impulsive responding using a delay-discounting procedure, whereby rats are more likely to choose a larger, delayed reward. However, in this instance, a higher dose (0.50 mg/kg) that significantly reduced the preference to work harder for a larger reward during the effort-discounting task did not alter choice behavior using a delay-discounting procedure.

Effort discounting with equivalent delays

Rats in group B were subsequently trained for 5 days on the variant of the effort-discounting task where a single press on the LR lever delivered two pellets after a delay equivalent to the average latency to complete the fixed ratio of presses on the HR lever. We again tested the effects of both the 0.25 and 0.5 mg/kg doses of amphetamine on this form of effort discounting. Analysis of the choice data again yielded significant main effects of Dose (F(2, 14)=7.00, P<0.01) and Block (F(3, 21)=12.891, P<0.001), and a significant Dose × Block interaction (F(6, 42)=5.28, P<0.001). As was observed using the standard effort-discounting procedure, 0.50 mg/kg of amphetamine again reduced the preference for the HR lever, although this effect was statistically significant only during the last trial block (fixed ratio 20; Figure 3c). In contrast, using this variant of the task, the 0.25 mg/kg dose did not alter choice behavior, even though this dose increased preference for the HR lever in the standard effort-discounting task. Analysis of the latency data revealed a significant Dose × Block interaction (F(6, 42)=4.40, P<0.05). This interaction was attributable to the fact that during the last trial block, rats displayed higher rates of responding on the HR lever after receiving the 0.25 mg/kg dose (6.4±0.6 s to complete 20 presses) compared to saline (7.2±0.7 s) or the 0.5 mg/kg dose (9.1±1.4 s). Again, both doses significantly increased locomotor counts (F(2, 14)=30.21, P<0.001), but did not alter the number of trial omissions (F(2, 14)=1.14, NS). Thus, the increased effort discounting induced by a higher dose of amphetamine cannot be attributed to a reduced tolerance for delayed rewards, given that this dose did not alter delay discounting but did alter effort discounting when animals incurred similar delays before receiving either a larger or smaller reward.
Conversely, the tendency for rats to work harder for a larger reward induced by lower doses of amphetamine is likely mediated through processes related to delay-based decision making.

Noncompetitive NMDA Receptor Antagonism with Ketamine

Effort discounting

Rats in group C were trained on the effort-discounting task for 21 days before receiving three doses of ketamine on separate test days in the following order: 10, 5, and 1 mg/kg. Rats in this group exhibited a slightly higher rate of trial omissions compared to groups A and B (1.5±1). Analysis of the choice data revealed significant main effects of Dose (F(3, 21)=8.92, P<0.001) and Trial Block (F(3, 21)=12.591, P<0.001), but no significant Dose × Block interaction (F(9, 63)=0.69, NS). As shown in Figure 4a, both the 5 and 10 mg/kg doses, but not the 1 mg/kg dose, of ketamine significantly (P<0.05) reduced the proportion of choices of the HR lever across all trial blocks. This reduction was already apparent during the first trial block (fixed ratio 2), and choice of the HR lever continued to decline across the session as the effort requirement increased. None of these doses affected the number of trial omissions (F(3, 21)=1.30, NS). The 10 mg/kg, but not the 5 or 1 mg/kg, dose increased the latency to complete the ratio of presses on the HR lever across all trial blocks (F(3, 21)=7.76, P<0.001 and Dunnett's, P<0.01). All three doses reduced locomotor counts (F(3, 21)=9.84, P<0.001). Thus, noncompetitive blockade of NMDA channels alters effort-based decision making, reducing the tendency for rats to choose a more effortful response to obtain a larger reward.

Figure 4

The effects of ketamine on effort- and delay-based decision making. All conventions are the same as Figure 2. (a) Ketamine dose dependently decreased the proportion of choices of the high-reward (HR) lever on the effort-discounting task. (b) The 5 mg/kg dose also induced impulsive choice on the delay-discounting task, reducing the preference for larger, delayed rewards. (c) Using the effort discounting with equivalent delays procedure, ketamine no longer affected choice.

Delay discounting

Rats in group C were subsequently trained on the delay-discounting task for 13 days. For this group, the delays to reinforcement delivery after a single press on the HR lever across the four trial blocks were set at 0.4, 1.7, 2.8, and 6.5 s, respectively. Challenge with the 5 mg/kg dose of ketamine increased impulsive responding, with rats making fewer choices of the HR lever that delivered four pellets after a delay. Analysis of the choice data confirmed that this effect was statistically significant (F(1, 7)=9.20, P<0.05; Figure 4b). The main effect of Block was also significant (F(3, 21)=4.87, P<0.01), indicating that rats chose the HR lever less often as the delay to reinforcement increased. Although there was no significant Dose × Block interaction (F(3, 21)=0.44, NS), inspection of Figure 4b indicates that the reduced preference for a larger, delayed reward was most pronounced during the latter trial blocks. This dose of ketamine did not significantly alter locomotor counts (F(1, 7)=4.57, NS) or the number of trial omissions (F(1, 7)=2.70, NS). Thus, similar to the effects of flupenthixol, ketamine increases delay discounting, reducing the preference for larger, delayed rewards.

Effort discounting with equivalent delays

After all drug tests using the delay-discounting procedure were complete, rats in group C were then trained for 11 days using the effort discounting with equivalent delays procedure. In contrast to what was observed using either the effort- or delay-discounting procedure, ketamine (5 mg/kg) did not alter choice behavior when the delay to reward delivery after a single response on the LR lever or multiple responses on the HR lever was comparable (Figure 4c). The ANOVA indicated no significant main effect of Dose, or Dose × Block interaction (both F's<2.12, NS). Similarly, ketamine did not alter the latency to complete the ratio of presses on the HR lever, the number of trial omissions, or locomotor activity (all F's<2.60, NS). These findings, in combination with the data from the effort- and delay-discounting procedures, indicate that the effects of ketamine on decision making are unlikely to be related to disruptions in cost–benefit analyses specifically related to effort. Rather, these observations suggest that ketamine reduced tolerance for delayed rewards, independent of the effort requirement associated with a particular choice. Furthermore, these findings demonstrate that noncompetitive NMDA receptor blockade alters cost–benefit analyses in a manner distinct from those induced by dopaminergic manipulations.

Effects of Acute and Long-Term Free Feeding

We assessed the effects of satiety manipulations on decision making using the standard effort-discounting procedure. After rats in group C had received their final drug test (flupenthixol, 0.25 mg/kg), we continued their food restriction and training on the effort-discounting procedure for another 3 days. Rats were then given ad libitum access to standard lab chow in their home cages and trained for another 5 days using the standard effort-discounting procedure. Statistical analyses of the various behavioral measures compared the average of each measure taken from the last 3 days of training while on food restriction to (1) the first day of training after ad libitum access to food was given (acute free feeding) and (2) the average of each measure taken from the last 3 days of training when given free access to food (long-term free feeding). Analysis of the choice data revealed significant main effects of Free feeding (F(2, 14)=6.62, P<0.01) and Block (F(3, 21)=33.14, P<0.001), and a Free feeding × Block interaction (F(6, 42)=2.69, P<0.05). Simple main effects analyses revealed that acute free feeding significantly (P<0.05) decreased the proportion of choices of the HR lever on the second and third trial blocks (5 and 10 presses, respectively), but not the first or last block, compared to performance during food restriction (Figure 5a). However, after another 4 days of free access to food, choice behavior stabilized, with rats displaying levels of performance comparable to those observed during food restriction. These satiety manipulations also increased the number of trial omissions after both acute (13±4) and long-term free feeding (8±3) compared to food restriction (4±2) (F(2, 14)=9.74, P<0.005). This effect on omissions was most prominent during the last trial block, after rats had already obtained a substantial amount of food. Both acute and long-term free feeding also caused a slight but statistically significant decrease in locomotor counts (F(2, 14)=6.89, P<0.01).
Neither satiety manipulation had a reliable effect on the latencies to complete the ratio of presses on the HR lever (all P's>0.10). Thus, acute free feeding after a prolonged period of food restriction can reduce the preference to exert more effort to obtain a larger reward. In contrast, after long-term free feeding, rats tend to make fewer choices overall, but when they do choose, their patterns of choice are similar to those observed when they are hungry. This latter finding indicates that the level of satiety may have a greater influence on whether an animal chooses to respond or not, but does not have as great an influence on the direction of choice using this procedure.

Figure 5

The effects of acute and long-term free feeding on effort-based decision making. Acute access to food ad libitum (gray circles) decreased the proportion of choices of the high-reward (HR) lever compared to performance under food restriction conditions (white squares). However, after 5 days of free feeding, this effect was no longer apparent (black circles).

DISCUSSION

Using a novel effort-discounting procedure, the present study provides important new information about the mechanisms by which DA and NMDA receptor activities mediate different types of cost/benefit decision making. DA receptor blockade reduced the preference for rats to either work harder or wait longer to obtain a larger reward. Moreover, when delay to reward delivery was equalized between response options, DA receptor blockade also reduced the preference for rats to exert more effort to obtain a larger reward. In contrast, increasing DA transmission with amphetamine exerted biphasic effects on decision making. Lower doses increased the tendency for rats to work harder or wait longer for larger rewards, whereas a higher dose reduced the preference for rats to work harder for larger rewards. The effects of lower doses of amphetamine on effort-based decision making were attributable to a reduction in delay discounting. Ketamine also increased both effort and delay discounting, but did not alter choice on the effort discounting with equivalent delays task, suggesting that the effects of this drug observed here are due primarily to alterations in processes related to delay discounting. Furthermore, given that these manipulations did not alter response rates on the HR lever, the effects of these drugs on choice cannot be attributed to impaired motor functioning. Thus, the present findings provide further evidence that effort- and delay-based decision making are pharmacologically dissociable and highlight the utility of these procedures in parsing out the neurochemical mechanisms that mediate these two processes.

Dopaminergic Modulation of Effort vs Delay-Based Decisions

Our findings that DA receptor blockade with flupenthixol made rats more inclined to choose the low-cost/LR option in all three tasks complement previous studies reporting similar alterations in either effort-based (Salamone et al, 1994; Denk et al, 2005) or delay-based (Wade et al, 2000; Denk et al, 2005; Van Gaalen et al, 2006) decision making following DA receptor antagonism. It is notable that, unlike flupenthixol, reducing motivation for food via long-term free feeding did not alter the pattern of choice on the effort-discounting task. This finding further supports the notion that DA transmission does not play a specific role in mediating the primary motivational impact of natural reinforcers such as food, but rather may be particularly important for overcoming work-related requirements that separate animals from significant stimuli (Salamone et al, 2007).

Previous studies of effort-based decision making have utilized a T-maze version of the task, where rats choose between climbing a barrier to obtain a larger reward in one arm, or entering an open arm to obtain a smaller reward (Salamone et al, 1994; Denk et al, 2005). However, in these tasks, the more effortful response invariably imposes some delay before a larger reward is obtained. This makes it difficult to disentangle whether DA antagonism reduces the preference to work harder or the preference to wait longer to receive larger rewards, given that DA antagonists also increase delay discounting. Indeed, Denk et al (2005) observed that administration of haloperidol reduced the tendency of rats to choose the HR arm, but also increased response latencies. In the present study, this complication was overcome by equalizing the relative delay cost between response options, so that a choice on the LR lever delivered a smaller reward after delays equivalent to those incurred when rats responded repeatedly on the HR lever. Our findings suggest that although DA mediates both effort- and delay-based decision making, DA antagonists alter choice behavior in these two tasks through separate cognitive processes. A similar conclusion was reached by Mingote et al (2005), who observed that time intervals imposed by schedules with moderate ratio requirements do not appear to mediate the rate-suppressing effects of DA depletions on instrumental responding. The modulation by DA of these different forms of decision making appears to be anatomically dissociable. DA transmission in the anterior cingulate and nucleus accumbens may be of primary importance in mediating effort-based decisions, as DA depletion or local infusion of DA antagonists in either of these regions disrupts this form of decision making (Salamone et al, 1994; Nowend et al, 2001; Schweimer et al, 2005; Schweimer and Hauber, 2006).
However, the specific DA receptors that mediate these effects may differ between brain regions. Both D1 and D2 receptors in the accumbens contribute to this form of decision making, yet only D1 receptor blockade in the PFC disrupts these processes (Nowend et al, 2001; Schweimer and Hauber, 2006). In contrast, excitotoxic lesions of medial PFC regions or DA lesions of the accumbens do not alter delay discounting (Cardinal et al, 2000; Winstanley et al, 2005). Rather, DA transmission in the orbital PFC may play a more prominent role in mediating the preference between small/immediate and larger/delayed rewards (Kheramin et al, 2004; Winstanley et al, 2006), but not effort-based decisions (Rudebeck et al, 2006). Thus, these related yet distinct forms of cost/benefit decision making appear to be mediated by dissociable neuroanatomical profiles of cortical and striatal DA activity.

Amphetamine induced distinct dose-dependent effects on effort- and delay-based decision making. These effects were likely attributable to increased DA transmission rather than other monoamines because (1) selective norepinephrine uptake blockers do not affect delay discounting (Van Gaalen et al, 2006), (2) the effects of amphetamine on delay discounting are blocked by selective D1 or D2 receptor antagonists (Van Gaalen et al, 2006), and (3) serotonin antagonists do not alter effort-based decision making (Denk et al, 2005). Lower doses of amphetamine increased the tendency of rats to work harder or wait longer to obtain a larger reward, consistent with numerous reports showing that lower doses of amphetamine or selective DA uptake blockers enhance the preference for larger and delayed rewards (Wade et al, 2000; Cardinal et al, 2001; Winstanley et al, 2005; Van Gaalen et al, 2006). Yet, when we equalized the relative delay to food delivery after responding on either the HR or LR lever, the increased preference for rats to exert more effort to obtain larger rewards was no longer apparent. This combination of findings indicates that lower doses of amphetamine did not increase the tendency for rats to work harder for larger rewards, or enhance the preference for larger rewards in general. Rather, the effects of lower doses of amphetamine on effort discounting may be attributable to an enhanced tolerance for the delays to reward incurred when rats were required to press the HR lever repeatedly to obtain a larger reward.

In contrast to the above-mentioned findings, a higher dose of amphetamine (0.5 mg/kg) decreased the preference for the HR lever using both the standard effort-discounting task and the equivalent delays procedure. In a similar vein, Cousins et al (1994) utilized a concurrent lever-pressing/feeding choice procedure, where rats were given the option of either pressing a lever five times to receive a more preferred reward pellet, or free access to standard lab chow. They observed that higher doses of amphetamine (2–3 mg/kg) reduced both lever pressing and chow consumption, and these effects were attributed to either the motor stimulant or appetite-suppressing effects of this drug. These are unlikely explanations for the present findings, because the 0.50 mg/kg dose of amphetamine used here did not affect delay-discounting performance, nor did it increase trial omissions. Furthermore, both the 0.25 and 0.50 mg/kg doses of amphetamine increased locomotion to a similar degree, but had opposite effects on effort discounting. Thus, it is apparent that either reductions or excessive increases in DA activity can disrupt effort-based decision making. These findings complement the well-documented phenomenon that dopaminergic modulation of PFC functioning takes the form of an ‘inverted U-shaped’ function, where too little or too much D1 receptor activity disrupts cognitive functions such as working memory (Arnsten, 1997; Williams and Castner, 2006). Under certain conditions, blockade or supranormal stimulation of D1 receptors in the PFC impairs performance on a variety of delayed response tasks (Zahrt et al, 1997; Floresco and Phillips, 2001; Chudasama and Robbins, 2004; Floresco and Magyar, 2006). In this regard, D1 receptors in the anterior cingulate region of the PFC also mediate effort-based decision making (Schweimer and Hauber, 2006). 
Therefore, it is plausible that the higher dose of amphetamine used in the present study resulted in excessive D1 receptor activation, which in turn may have hampered patterns of activity in PFC neural networks that normally integrate information about the relative response costs and reward magnitude associated with each response option. Elucidating the effects of local stimulation of D1 receptors in the PFC on these different forms of decision making awaits further investigation.

Increases in Impulsive Responding Induced by Ketamine

Blockade of NMDA channels with ketamine also reduced the preference for the HR lever using both the effort- and delay-discounting procedures. However, when rats were subsequently trained using the effort discounting with equivalent delays procedure, ketamine did not alter choice behavior. It is unlikely that this lack of effect was due to overtraining, because flupenthixol was still effective in altering choice behavior on this task in these same animals. It is important to note that ketamine also acts on 5-HT2 receptors and, in particular, exerts partial agonist actions on high-affinity D2 receptors (Kapur and Seeman, 2002). Therefore, we cannot rule out that ketamine-induced increases in impulsive choice may be mediated through actions on these other receptors, in addition to NMDA channels. However, given that increased DA activity typically decreases impulsive responding, it is unlikely that the effects of ketamine reported here were due to activation of D2 receptors. Similarly, blockade of 5-HT2A and 5-HT2C receptors also reduces impulsive choice (Talpos et al, 2006). Nevertheless, the fact remains that, unlike dopaminergic agents, ketamine exerts a selective effect on decision-making processes related to delay discounting, but not on effort-based decisions per se. To our knowledge, this is the first demonstration that this class of NMDA antagonist increases impulsive responding for small/immediate rewards. Moreover, these findings further highlight the utility of these effort-discounting procedures in pharmacologically dissociating the neurochemical mechanisms that underlie these different, yet interrelated forms of decision making.

The effects of ketamine on delay discounting may be related to the actions of this type of compound on short-interval timing. For example, noncompetitive NMDA antagonism with MK-801 produces an overestimation of elapsed time in rats performing a peak-interval procedure (Miller et al, 2006). In the present study, ketamine may have induced a similar effect, causing an overestimation of the delay to reward, which would be expected to further bias the animal's choice toward immediate/smaller rewards. These effects may be mediated in part through actions on the hippocampus, given that ketamine disrupts hippocampal functioning (Greene, 2001) and lesions of the hippocampus increase impulsive responding using a similar delay-discounting procedure (Cheung and Cardinal, 2005). Further insight into the neural mechanism by which ketamine may alter delay-based decision making comes from neurophysiological studies in awake, behaving rats. Roesch et al (2006) observed that a substantially greater proportion of neurons in the orbital PFC increased firing in response to smaller, immediate rewards when compared to the number of cells that increased activity prior to larger, delayed rewards. Furthermore, many of these neurons that increased firing for small/immediate rewards predicted the future behavior of the animals on free-choice trials. Based on these findings, the authors posited that the dominant output signal from the orbital PFC biases responses toward immediate rewards. With respect to the present study, ketamine has been shown to increase glucose utilization in the orbital PFC (Duncan et al, 1998), and NMDA channel blockade with MK-801 increases firing of medial PFC neurons in awake rats (Jackson et al, 2004; Homayoun et al, 2005; Homayoun and Moghaddam, 2006).
When viewed collectively, it is reasonable to propose that ketamine may disproportionately increase firing of a subpopulation of orbital PFC neurons that encode for immediate rewards, enhancing impulsive responding by further biasing choice behavior toward immediate yet smaller rewards.

It is interesting to note that administration of ketamine in humans causes perceptual and cognitive abnormalities reminiscent of the ‘positive’ and ‘negative’ symptoms of schizophrenia. Furthermore, it is well established that ketamine or related compounds impair other cognitive functions mediated by different subregions of the PFC that are disrupted in schizophrenia, including working memory (Verma and Moghaddam, 1996) and behavioral flexibility (Idris et al, 2005; Stefani and Moghaddam, 2005). Interestingly, a recent study of impulsive decision making has revealed that patients with schizophrenia are more prone to choosing immediate over long-term rewards, in a manner similar to rats treated with ketamine in the present study (Heerey et al, 2007). Thus, the use of noncompetitive NMDA antagonists such as ketamine in experimental animals may have utility in modeling impulsive decision making observed in this disorder.

SUMMARY AND CONCLUSIONS

It is becoming increasingly apparent that different forms of cost/benefit decision making are subserved by overlapping yet dissociable neural circuits linking limbic and striatal areas, such as the amygdala and ventral striatum, to separate regions of the frontal lobes (Winstanley et al, 2005; Rudebeck et al, 2006; Floresco and Ghods-Sharifi, 2007). With the use of these novel decision making assays, the present study provides important new insight into the dissociable neurochemical mechanisms that mediate effort- and delay-based decisions. Administration of a DA receptor antagonist revealed that DA mediates both processes, but that the role of DA in effort-based decision making is independent of its mediation of impulsive choice. Furthermore, increases in DA transmission with amphetamine can exert differential effects on these two types of decisions, with higher doses disrupting effort (but not delay)-based decision making. In contrast, NMDA receptors appear to play a more selective role in biasing choices between small/immediate and large/delayed rewards. Obtaining a more comprehensive understanding about the neurochemical bases of these different forms of decision making may help to clarify how perturbations in dopaminergic and glutamatergic signaling can lead to alterations in decision making that are observed in a number of psychiatric disorders.