Published online by Cambridge University Press: 01 November 2004
HIV-seropositive (HIV+) drug users show impaired performance on measures of integrity of prefrontal–subcortical systems. The Iowa Gambling Task (GT) is mediated primarily through ventromedial–prefrontal systems, and poor performance on this measure (“cognitive impulsivity”) is common among substance dependent individuals (SDIs) as well as patients with disease involving prefrontal–subcortical systems (e.g., Huntington disease). We hypothesized that HIV+ SDIs might be more vulnerable to cognitive impulsivity when compared with HIV-seronegative (HIV−) SDIs because recent studies report evidence of additive effects of HIV serostatus and drug dependence on cognition. Further, working memory is considered a key component of GT performance and is reliably impaired among HIV+ SDIs compared to controls. We administered the GT to 46 HIV+ and 47 well-matched HIV− males with a past or current history of substance dependence. In addition, we evaluated correlations between subjects' scores on the GT and on a delayed nonmatch to sample (DNMS) task in order to test if working memory deficits accounted for cognitive impulsivity among the HIV+ subjects. The HIV+ subjects performed significantly more poorly on the GT compared to the HIV− group but this effect could not be explained by working memory deficits. Implications of these findings for future basic and applied studies of HIV and substance dependence are discussed. (JINS, 2004, 10, 931–938.)
HIV has a relative affinity for basal ganglia and prefrontal cortex. Accordingly, HIV-seropositive (HIV+) persons show relatively selective defects compared with seronegative (HIV−) risk-matched controls in cognitive functions mediated through prefrontal–subcortical networks. In a series of studies of HIV+ and HIV− substance dependent individuals (SDIs) we have demonstrated a reliable pattern of defects in mental operations of this type, including controlled processing, response inhibition, and working memory (Farinpour et al., 2000; Martin et al., 2001). The purpose of the current study is to evaluate the integrity of a more complex cognitive process that is mediated through prefrontal–subcortical networks. Specifically, we investigated the status of decision-making, a function defined by Bechara, Damasio and colleagues (Bechara et al., 1997, 2000) as “the ability to select the most advantageous response from an array of [immediate] possible behavioral choices.”
The term “decision-making” has been employed in multiple models in the cognitive and clinical neuropsychological literature. We employed the Iowa Gambling Task (GT), which is designed specifically to capture components critical to the Bechara/Damasio model of decision-making. Throughout this article, “decision-making” refers exclusively to the Bechara/Damasio operational definition and whenever possible we refer to more specific cognitive processes tapped by this task. Deficits on the GT have been termed “cognitive impulsivity.” Throughout this manuscript we use the terms “GT deficits” and “cognitive impulsivity” interchangeably.
According to the Bechara/Damasio model, multiple cognitive and affective processes influence response selection in real time. These include but are not limited to retrieval of representations of previous rewards and punishments associated with each available behavioral choice; temporary maintenance of these representations online in working memory; level of responsivity to rewards and punishments; and planning for an optimal future outcome or goal. The model predicts that deficits in any of these operations result in impaired decision-making, or cognitive impulsivity: selections biased toward the response choice associated with the greatest immediate reward, regardless of punishment or of the future consequences of this behavior.
Bechara and his colleagues introduced the Iowa Gambling Task (GT) to assess various cognitive components of the decision-making process in real time and reported that deficits on this task are characteristic among clinical groups that include persons with focal lesions of ventromedial prefrontal cortex or amygdala (see Bechara et al., 2000, for a review). In addition, the Iowa group and other investigators (Bechara et al., 2001; Bolla et al., 2003; Grant et al., 2000) have reported that SDIs perform the GT more poorly as a group compared to matched non-drug-using controls. Additionally, SDIs' performance patterns typically resemble those of patients with ventromedial prefrontal (VM) lesions, although their deficit is less severe. This finding is not unexpected as SDIs and VM patients share several common neurocognitive characteristics such as impulsivity, inability to modify ineffective response strategies, and seeming indifference toward incorrect responding. Further, SDIs show evidence on functional neuroimaging studies of abnormal activity in orbitofrontal cortex, a subregion of ventromedial prefrontal cortex (Goldstein & Volkow, 2002; Grant et al., 2000; Volkow & Fowler, 2000).
Although SDIs as a group are impaired on the GT, only a subgroup of individuals actually perform the task abnormally (Bechara & Damasio, 2002; Bechara et al., 2001, 2002; Bechara & Martin, 2004) and one question concerns which variables are associated with increased risk of impaired GT performance among this group. In this study we investigated the possibility that among SDIs, HIV+ persons might be more vulnerable to GT deficits compared with HIV− controls. Rationale for this hypothesis is based on recent evidence that HIV and drugs of abuse show additive deleterious effects on neurocognitive functions mediated by prefrontal–subcortical neural systems; and that patients with Huntington disease or other frontal–striatal disorders also show impairment on the GT (Rippeth et al., 2004; Stout et al., 2001).
Working memory deficits are prominent among HIV+ SDIs compared with HIV− controls. Notably, Bechara and Martin (2004) recently reported that GT deficits were associated significantly with working memory deficits in a sample of seronegative users of primarily methamphetamine, alcohol and cocaine. Scores on the working memory task employed in the Bechara and Martin study were available for our subjects and have been reported in a separate publication (Martin et al., 2003). Accordingly we employed these scores to test a third hypothesis, that impaired working memory performance would account at least in part for HIV+ SDIs' GT deficits.
Table 1 shows subject characteristics for the two groups. We enrolled and tested a total of 46 HIV+ and 47 ELISA-verified HIV− males aged 18–55 years and diagnosed with current or previous substance dependence. Subjects were recruited from infectious disease and substance abuse clinics at the VA Chicago Health Care System–West Side, the University of Illinois–Chicago HIV Early Intervention Clinic at Mile Square Health Center, community HIV and drug treatment programs and by word-of-mouth among residents of recovery houses and shelters. Subject ethnicity was 91% African American, 4% Hispanic and 5% White. All subjects were medically stable with no history of closed head injury with loss of consciousness greater than 30 min, open head injury, schizophrenia, seizure disorder, current alcohol dependence, current neuroleptic use, or less than 10 years of education. Subjects showed no evidence of overt cognitive deficits on screening interview and no history of neurologic impairment by medical record review. All subjects were capable of providing informed consent for the study, which was approved by the UIC Institutional Review Board.
Participant characteristics
The HIV+ subjects had no history of dementia or other neurologic AIDS-defining disorders and their median CD4 lymphocyte count was 521 at testing (range 41–2040). Approximately 10% of the group had AIDS-defining CD4 counts (<200). Eighty percent were prescribed antiretroviral therapy at testing, 50% with highly active antiretroviral therapy (HAART), that is, combinations that included at least one protease inhibitor, and 30% with reverse transcriptase inhibitor combinations or monotherapy. Plasma viral load was undetectable at less than 400 copies/ml for 31% of subjects. Eighty-five percent of subjects complained of at least one constitutional symptom in the previous week.
All subjects underwent Breathalyzer testing and rapid urine toxicology screening for cocaine, cannabis and opiates. Subjects' urine specimens were also tested for a more extensive panel of street drugs with confirmation. No additional drug-using subjects were identified on extended testing in this study group.
All subjects were administered the Structured Clinical Interview for DSM–IV–Substance Abuse Module (SCID–SAM; First et al., 1995) to verify substance dependence diagnoses. Approximately 89% of subjects were diagnosed with a history of cocaine dependence, 80% with alcohol and 68% with heroin. Eighty-five percent of subjects met diagnostic criteria for polysubstance dependence. Sixty percent of subjects reported a history of injection drug use (IDU).
Subjects also completed the Addiction Severity Index (McLellan et al., 1985), a standardized measure employed routinely in clinical evaluations of substance abusers in order to estimate severity of drug and alcohol abuse, as well as associated social problems and medical complications.
In addition, all subjects completed a series of standardized paper and pencil measures of personality traits or disorders comorbid with substance dependence and with potentially confounding effects on the cognitive data. These variables included history and symptoms of attention deficit disorder, antisocial personality traits, sensation seeking, and symptoms of post-traumatic stress disorder, indexed respectively by the Wender Utah Rating Scale (Ward et al., 1993), the Socialization subscale of the California Personality Inventory (Rosen & Schalling, 1974), the Sensation Seeking Scale–Version V (Zuckerman, 1996), and the PTSD Checklist-Civilian version (Keane et al., 1987). In addition, subjects completed the Wide Range Achievement Test–III (WRAT–III; Wilkinson, 1993) Oral Reading subtest in order to screen for potential reading disorders. Controlling for these variables is necessary in any investigation of neurocognition in substance abusers and is particularly relevant to the current investigation, since the literature indicates that some of these comorbid conditions (i.e., attention deficit disorder; antisocial personality traits) are associated with deficits on the GT (M. Ernst et al., 2003; Mitchell et al., 2002).
Finally, all subjects completed the Beck Depression Inventory (Beck et al., 1961) and the state version of the State–Trait Anxiety Inventory (Spielberger et al., 1971) to monitor current psychological distress at the time of testing.
We informed all subjects explicitly that no study data, including Breathalyzer readings, toxicology screening results or self-report of substance use, would be entered in their medical charts or provided without their written consent to any individual or facility, including their health care or substance abuse treatment providers, law enforcement agencies or the courts. Subjects were also informed that their data were protected from subpoena by a Certificate of Confidentiality obtained from the National Institute on Drug Abuse. These assurances are known to increase the reliability of drug abusers' self-reported alcohol and drug use because negative consequences of disclosure are minimized (Darke, 1998). In this regard, whenever possible we employed self-administered computerized versions of measures of addiction severity, which are known to increase subjects' rate of endorsement of substance use (Macalino et al., 2002).
We administered the gambling task (GT) on a Dell Optiplex Pentium PC using the standard task materials and subject instructions employed by the Iowa group. A continuous display of four card decks labeled A, B, C, and D remained on the screen for the duration of the task. Participants were instructed to select cards one at a time by clicking on any of the four pictured decks. Participants were also told that: each and every card selection would result in a win of some amount of money; occasionally a selection would result in a loss of money as well; the amount of money won or lost would vary across selections; some of the card decks were associated with higher wins than others; and that their job was to win as much money as possible and complete the task with a winning score.
Each participant made 100 selections from the decks, administered as five blocks of 20 trials each. The order of rewards and punishments within each deck was identical for all subjects. Unbeknownst to participants, selections from decks A and B (“bad decks”) resulted in large wins but occasional large losses. Choices from decks C and D (“good decks”) resulted in smaller wins and occasional smaller losses compared with choices from the bad decks. Consistent choices from good decks provided small immediate rewards but resulted in a net gain at the end of 100 trials, while choices primarily from bad decks provided large immediate rewards but an overall total loss. Subjects typically select cards primarily from the bad decks at the start of data collection; however, subjects without neurologic dysfunction shift their choices primarily to good decks over trial blocks and complete the task with a winning score (Bechara et al., 2000).
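The payoff asymmetry described above can be sketched in a few lines. The dollar values below follow the schedule commonly published for the original Iowa task; they are assumptions for illustration here, not figures taken from this article.

```python
# Illustrative sketch of the Gambling Task payoff structure.
# Per-card wins are larger for the "bad" decks (A, B), but losses
# accumulated over each run of 10 selections are far larger too.
WIN_PER_CARD = {"A": 100, "B": 100, "C": 50, "D": 50}
LOSS_PER_10 = {"A": 1250, "B": 1250, "C": 250, "D": 250}

def net_outcome(deck: str, n_cards: int) -> int:
    """Net money (wins minus losses) after n_cards consistent picks from one deck."""
    wins = WIN_PER_CARD[deck] * n_cards
    losses = LOSS_PER_10[deck] * (n_cards // 10)
    return wins - losses

# 100 consistent selections: good decks yield a net gain, bad decks a
# net loss, despite the bad decks' larger per-card wins.
print(net_outcome("C", 100))  # 2500 (net gain)
print(net_outcome("A", 100))  # -2500 (net loss)
```

This makes concrete why consistently choosing the immediately larger reward ("cognitive impulsivity") produces an overall losing score.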
All subjects had participated previously in an investigation of working memory performance using a delayed non-match to sample (DNMS) task employed by the Iowa group in previous GT studies (Bechara & Martin, 2004). This task stresses working memory mechanisms by requiring subjects to retain information online during a delay period prior to making a response choice. In a separate publication (Martin et al., 2003) we reported that HIV+ subjects performed the DNMS significantly more poorly compared with HIV− subjects (see Figure 1).
Working memory performance for HIV+ and HIV− groups on a delayed non-match to sample task (data reported in Martin et al., 2003). Scores represent percent correct choices for 10, 30, and 60 s delay periods.
A detailed description of the DNMS procedure is available in Martin et al. (2003). Briefly, at the start of each trial subjects view a brief display of a single red or black card. Following a time delay of 10, 30 or 60 s after stimulus offset, a new display of two red and two black cards appears and subjects are asked to select two cards. Subjects perform a reading task during the delay periods to prevent rehearsal. A correct choice consists of the cards differing in color from the initial display. Subjects receive no explicit instructions regarding the rule governing response selection; prior to data collection subjects complete a block of practice trials with no time delay in order to insure they have deduced the rule correctly.
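The non-match response rule described above can be summarized as a small scoring sketch; the card representation and display layout here are simplifications for illustration, not the task's actual implementation.

```python
# Minimal sketch of the DNMS response rule: after the delay, the correct
# choice is the pair of cards whose color differs from the sample card.
from typing import List

def correct_choice(sample_color: str, display: List[str]) -> List[int]:
    """Indices of the two cards in the four-card display that do NOT
    match the sample color (the non-matching pair is correct)."""
    return [i for i, color in enumerate(display) if color != sample_color]

def score_response(sample_color: str, display: List[str], chosen: List[int]) -> bool:
    """A response is correct only if both selected cards differ in color
    from the initially displayed sample card."""
    return sorted(chosen) == sorted(correct_choice(sample_color, display))

# Sample card was red; the display holds two red and two black cards.
print(score_response("red", ["red", "black", "red", "black"], [1, 3]))  # True
```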
Table 1 shows demographic data and mean scores on measures of comorbid and potentially confounding conditions. HIV+ and HIV− groups did not differ significantly in mean age, education, estimated Verbal IQ by the AmNART, or scores on the WRAT–III Oral Reading, the STAI, or the BDI, t < 1 for each comparison with the exception of the BDI [t(90) = 1.53, p = .13]. There were no significant group differences for mean scores on measures of comorbid personality traits or disorders, p > .20 for all tests, with the exception of a higher mean score for the controls on the Sensation Seeking Scale [t(91) = 2.39, p < .05]. Thus, groups were well matched on demographic variables, potential confounds, level of current psychological distress and estimated intellectual function.
Table 2 shows substance dependence characteristics for the two groups. We found no statistically significant differences between HIV− and HIV+ groups in prevalence of SCID–SAM diagnoses of current or previous cocaine [χ2(1) = 2.96, p < .09], heroin, alcohol or polysubstance dependence (χ2 < 1 in each instance). Compared to HIV+ subjects, the HIV− group scored significantly higher on the ASI–Drug subscale [t(90) = 2.47, p < .05]; reported significantly more years of drug abuse [t(90) = 2.26, p < .05]; and reported a marginally significant trend toward fewer days of abstinence since their last drug use [Mann-Whitney z = −1.89, p < .06], indicating greater severity of substance use among the control group. These findings indicate that any GT deficits shown by the HIV+ subjects could not be attributed to group differences in drug abuse severity.
Substance dependence characteristics for participant groups
We analyzed GT performance using mixed design ANOVA, with serostatus as the between-subjects factor and trial block as the within-subjects factor.
Figure 2 shows the mean number of selections from the “bad decks” (risky choices) over five blocks of 20 trials each for the HIV+ and HIV− groups. We found an expected significant linear trend for trial block [F(1,89) = 7.42, p < .01]; inspection of the means indicated that total risky choices decreased over trial blocks. There was also a significant main effect for serostatus [F(1,89) = 4.44, p < .05], indicating that the seronegative group outperformed the seropositive group overall. The Block × Serostatus interaction was not statistically significant, F < 1.
Mean number of selections from the Gambling Task “bad decks” by HIV+ and HIV− groups over five trial blocks.
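The declining linear trend over blocks reported above can be illustrated with the standard orthogonal-polynomial contrast for five levels. The block means below are synthetic placeholders, not the study's data, and this sketch shows only the direction of the contrast, not the ANOVA significance test itself.

```python
# Linear-trend contrast over five trial blocks, using the standard
# orthogonal-polynomial weights for a five-level within-subject factor.
LINEAR_WEIGHTS = [-2, -1, 0, 1, 2]

def linear_trend(block_means):
    """Contrast value: negative when risky choices decrease across blocks."""
    return sum(w * m for w, m in zip(LINEAR_WEIGHTS, block_means))

# Synthetic example: risky ("bad deck") choices declining over blocks
# yields a negative contrast, matching the direction reported above.
print(linear_trend([13.0, 12.0, 11.5, 10.5, 10.0]))
```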
Decision-making total scores (total number of risky choices) did not correlate significantly with scores for sensation seeking, socialization, current psychological distress, ADD symptoms or PTSD symptoms (−.15 < r < .15 in each instance).
GT scores did not differ significantly between HIV+ subjects with detectable versus undetectable viral load, F < 1. We could not compare subjects with and without a current immunologic AIDS diagnosis because only 5 subjects had CD4 lymphocyte counts below 200. Similarly, the very small percentage of subjects who were not receiving antiretroviral therapy at testing precluded valid comparisons of GT scores for treated vs. untreated patients. However, we noted that subjects currently prescribed HAART performed significantly better on the GT compared with subjects who were untreated or prescribed reverse transcriptase inhibitors only [t(39) = 2.13, p < .05].
Using the DNMS findings reported by Martin et al. (2003; see Figure 1), we computed an index of overall working memory performance by averaging the three scores obtained by each subject on this task (i.e., percent correct responses for ISIs of 10, 30, and 60 s). In order to investigate the association between working memory deficits and impaired GT performance we computed the correlation between GT total scores and averaged working memory scores. We intended to perform an analysis of covariance (ANCOVA) of GT scores using the working memory index scores as the covariate. However, these scores were essentially uncorrelated (r = −.08, p > .43), thus obviating the planned analysis of GT scores with working memory scores covaried. Additionally, we performed an ANCOVA using DNMS scores for the 30-s delay only (mean scores for the subject groups differed maximally for this delay period) but results were unchanged.
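The index and correlation described above amount to a simple computation, sketched below with stdlib tools only. The subject scores in the usage example are synthetic, not the study's data.

```python
# Sketch of the working-memory index (mean percent correct across the
# 10-, 30-, and 60-s delays) and a Pearson correlation with GT totals.
from statistics import mean, stdev

def wm_index(pct_correct_10, pct_correct_30, pct_correct_60):
    """Overall working memory score: average across the three delay periods."""
    return mean([pct_correct_10, pct_correct_30, pct_correct_60])

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Synthetic illustration: four subjects' WM indices vs. GT total
# (number of risky choices); a near-zero r would obviate the ANCOVA.
wm = [wm_index(90, 80, 70), wm_index(85, 75, 65), wm_index(95, 90, 85), wm_index(70, 60, 50)]
gt_totals = [48, 52, 50, 49]
print(pearson_r(wm, gt_totals))
```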
We compared the performance of HIV+ with HIV− substance dependent individuals (SDIs) on a simulated gambling task designed to measure decision-making as defined by the Bechara/Damasio model; that is, the process of selecting a current behavioral response that will optimize future outcome by maximizing total gain and minimizing losses. Many SDIs perform this task abnormally compared to normal controls, and as a group their pattern of impairment typically resembles that of patients with focal lesions of ventromedial prefrontal cortex, although SDIs' defects are less common and less severe.
We anticipated that a positive HIV serostatus might be associated with increased vulnerability to GT deficits (also known as cognitive impulsivity) since abnormal prefrontal cerebral activity is common among both HIV+ persons and SDIs (Chang et al., 2001; T. Ernst et al., 2002; Volkow et al., 2001) and HIV+ subjects show deficits in several mental operations that are hypothesized to contribute to GT performance, such as working memory and response inhibition (Hardy & Hinkin, 2002). Our findings confirmed this prediction: we found that HIV+ SDIs made significantly more disadvantageous choices on the GT (i.e., selected significantly more cards from the higher-risk, or bad, decks) compared with HIV− controls, indicating a higher level of cognitive impulsivity among the seropositive group.
Our subject groups were well-matched on demographic characteristics, current psychological distress, prevalence of DSM–IV diagnoses of substance dependence, and measures of comorbid conditions associated with substance dependence that can confound cognitive performance, including reading disorder, history of ADD, antisocial personality traits, and symptoms of post-traumatic stress disorder. Groups showed significant differences on indices of drug abuse severity including years of drug abuse and mean scores on the Addiction Severity Index Drug scale, but in the direction of greater severity among the seronegative subjects. Consequently, the HIV-associated defect in decision-making cannot be attributed to group differences in years of drug use or ASI scores. However, additional variables can be used to index substance use severity and merit investigation as mediating variables in GT performance of HIV+ subjects. In this regard, Bechara and Martin (2004) reported preliminary observations that GT defects were significantly more common among (seronegative) methamphetamine dependent subjects (88%) compared to subjects primarily dependent on alcohol (50%) although groups did not differ in demographic characteristics, days abstinent or years of drug use.
The majority of HIV+ subjects were prescribed antiretroviral therapies and very few had AIDS-defining CD4 counts at testing. These and earlier findings reported by our group suggest that some cognitive deficit might persist despite successful immune restoration (Martin et al., 2003), which would also be consistent with the lack of significant differences we observed in GT performance between HIV+ subjects with detectable versus undetectable (plasma) viral load. Additionally, we observed that HIV+ subjects currently prescribed HAART made significantly fewer risky card choices compared with subjects not currently prescribed HAART. This observation pertains less directly to this study's primary focus and was not investigated in detail, but it certainly merits additional study given its possible implications both for treatment effects and for cognitive components of successful adherence to antiretroviral therapy.
Our groups' GT performance pattern across trial blocks resembles closely the findings for SDIs reported by Bechara et al. (2001). Notably, the demographic and substance abuse characteristics of the Iowa and Chicago samples differ markedly (i.e., White alcohol and methamphetamine users comprised the Iowa subject group compared with our predominantly African American heroin and crack cocaine users). This common pattern of task performance across drug-using groups is consistent with a central tenet of current neurocognitive models of addictive processes; that is, that in addition to idiosyncratic effects of each drug, all drugs of abuse act through a common pathway that includes ventral striatum, amygdala, and orbitofrontal cortex (Goldstein & Volkow, 2002; Volkow & Fowler, 2000). This model implies that users of different drugs might share some common cognitive defects, and our findings suggest that decision-making might represent one such broadly susceptible function.
Our subjects' scores on a delayed nonmatch to sample task did not correlate significantly with GT scores and thus we were unable to demonstrate a significant working memory component to HIV+ subjects' deficits on the GT. However, working memory systems can be stressed by manipulating different task parameters such as memory load, processing complexity, or time delay between stimulus presentation and response choice. Our task manipulated time delay but held the other two parameters constant. The possibility that these additional elements of working memory processing contribute to cognitive impulsivity in HIV+ subjects should be investigated further. In this regard, Hinson et al. (2002) reported that normal subjects' GT performance was associated significantly with scores on a measure that stressed working memory by increasing memory load, suggesting that measures such as the n-back procedure (e.g., Hinkin et al., 2002) might correlate more highly with GT performance.
Notably, the Iowa group has reported that only a subgroup of SDIs obtain individual GT scores in the abnormal range. It is possible that a comparable pattern of findings is characteristic of HIV+ SDIs as well. This speculation is consistent with the extensive literature reporting that cognitive impairment is not invariably present among HIV+ individuals and can range in severity from abnormal cognitive performance with no concurrent impairment in daily function to a frank dementia. It is not yet clear if HIV amplifies the mechanisms responsible for impaired decision-making among HIV− SDIs or affects additional neurocognitive components of the decision-making process. Further investigation of risk factors for decision-making deficits among HIV+ SDIs is indicated.
It is critical to emphasize that the GT is a multifactorial task with multiple candidate mechanisms of impaired performance. The broader question of cognitive mechanisms of GT performance has been addressed successfully by studies that employed variants of the original GT with manipulation of parameters such as schedules and types of reinforcement to fractionate decision-making performance. Performance patterns of different clinical groups can be differentiated using these task variants (Bechara et al., 2002; Manes et al., 2002; Ornstein et al., 2000). In addition, evidence suggests that within-group performance can be subtyped according to differing mechanisms of GT performance. For example, Bechara et al. (2001) reported that poor GT performance could be attributed to insensitivity to future consequences (“myopia for the future”) among one subgroup of SDIs and to hypersensitivity to reward among another. We are employing Bechara's variants of the original GT currently to determine if HIV+ SDIs can also be subtyped according to mechanism of decision-making deficit.
Carefully designed functional neuroimaging studies also have significant potential to investigate underlying neural substrates of deficits in GT performance among HIV+ SDIs. A recent PET study by Bolla et al. (2003) documented patterns of abnormal orbitofrontal and dorsolateral prefrontal activity during GT performance by abstinent cocaine users. Recent fMRI studies of cognition in HIV+ subjects have employed successfully neurocognitive probes of working memory (Chang et al., 2001; T. Ernst et al., 2002). Findings from these studies could be integrated to generate more specific and testable hypotheses about neural systems underlying cognitive impulsivity among HIV+ SDIs.
Functional neuroimaging studies also have the potential to move our understanding of HIV and substance dependence forward by expanding the focus of inquiry to include limbic systems accorded a critical role in addictive processes, such as ventral striatum and nucleus accumbens. Carefully designed neurocognitive probes have been employed successfully to differentiate neural circuitry underlying working memory from systems underlying processing of rewards. For example, Pochon et al. (2001) tested normal subjects using identical stimuli under conditions that manipulated either working memory demands or reward salience. They reported that fMRI identified limbic and prefrontal cortical brain regions active during working memory performance or in response to changes in reward salience or under both conditions. Investigation of the integrity of these systems might shed further light on specific mechanisms of HIV-associated GT deficits among SDIs as well as potential additive effects of HIV and drug abuse on cognition.
The construct of decision-making is a natural choice for understanding the cognitive components of SDIs' persisting engagement in behaviors, such as sexual and injection practices, that put them at risk for exposure to or transmission of HIV and additional medical problems such as hepatitis C virus (HCV), despite full knowledge of both the long-term health risks and prevention/harm reduction strategies. In addition, because both immediate reward contingencies and future outcome influence behavioral choice, decision-making is particularly relevant when modeling cognitive aspects of medical compliance such as adherence to antiretroviral and/or interferon therapy regimens. Studies of these issues by our group are currently in progress.
Our subject groups do not represent the full spectrum of SDIs. We tested males recruited primarily from treatment settings who tested negative on toxicology screens and breath tests. Consequently, we cannot generalize our findings uncritically to women, out-of-treatment addicts or subjects who continue to use drugs while in treatment. Nonetheless, our findings of differences in cognitive impulsivity according to HIV serostatus move our understanding of HIV-related cognitive defects forward and can broaden our areas of inquiry to include limbic and better-defined prefrontal neural systems.
We thank Mary Beth Graham, Jose Bolanos, Pyrai Vaughan, and Joe Sangster for subject referrals, Stephanie King for assistance with data collection and manuscript preparation, Jon Spradling for computer programming, Charles Valenta for assistance with data entry and Raul Gonzalez and three anonymous reviewers for valuable critiques of the manuscript. Supported by HHS R01 DA12828 to Eileen Martin.