Introduction

Atrial fibrillation (AF) currently affects at least 33.5 million adults worldwide, not including subclinical or undiagnosed cases [1, 2], and even this figure probably underestimates the global prevalence of documented AF because of limited data from outside Europe and North America [1, 3].

Recent population-based studies and stroke registries consistently report a substantial AF-attributable risk of stroke, particularly in the elderly [4]. Approximately 1 in 3–4 patients presenting with an ischemic stroke also has AF (either already known, first diagnosed at the time of acute stroke, or documented during post-stroke monitoring) [4, 5]. Compared with strokes from other causes, AF-related strokes are more often fatal or associated with greater permanent neurological deficit [6], but they can be effectively prevented using oral anticoagulant therapy (OAC) with well-controlled vitamin K antagonists (VKAs) [7] or the non-vitamin K antagonist oral anticoagulants (NOACs) apixaban [8], rivaroxaban [9], dabigatran [10], or edoxaban [11]. The use of OAC is also associated with a significant reduction in all-cause mortality in AF patients: by 26% with VKAs versus control/placebo [7], with a further 10% relative risk reduction with NOACs compared to warfarin in a meta-analysis [12].

The decision to use OAC in an individual AF patient is based on the estimated balance between the benefit of ischemic stroke reduction and the risk of major OAC-related bleeding [essentially intracranial hemorrhage (ICH)]. Better appreciation of the importance of the quality of VKA anticoagulation [a target time in therapeutic range (TTR) of ≥70%] and the availability of NOACs (which offer better safety than VKAs) [12] have lowered the estimated annual stroke risk threshold for OAC treatment in AF patients from 1.7% (as estimated for standard VKA treatment) to 0.9%, assuming that NOACs are available [13].

Still, contemporary registry-based data show that OAC is often underused in AF patients at increased risk of stroke [14,15,16,17,18]. Uncertainty about whether to use OAC may be particularly pronounced in AF patients with a single additional stroke risk factor, who are often (mis)perceived as having a “borderline” or insufficient stroke risk to trigger the use of OAC. There is also some inconsistency among formal AF guidelines regarding the use of OAC in this subset of AF patients, as shown in Table 1 [19,20,21,22,23,24,25].

Table 1 Guideline recommendation for thromboprophylaxis in AF patients with a single additional stroke risk factor

In this review article, we summarize the basic principles of stroke risk stratification in AF patients and discuss contemporary real-world evidence on OAC use and outcomes of OAC treatment in AF patients with a single additional stroke risk factor in various real-world AF cohorts. This article is based on previously conducted studies and does not involve any new studies of human or animal subjects performed by any of the authors.

Stroke Risk Stratification and Thromboprophylaxis in Patients with Atrial Fibrillation

Patients with AF have an excess risk of stroke compared to their counterparts without AF, but individual stroke risk is not homogeneous and depends on the presence (or absence) of various stroke risk factors [26]. To facilitate the assessment of AF-related stroke risk in clinical practice, established clinical stroke risk factors derived from the control or placebo arms of historical trials of stroke prevention in AF [7] or from large observational AF cohorts have been combined into various stroke risk scores, such as the CHADS2 and CHA2DS2-VASc scores [27] (Table 2), the latter being more inclusive of relevant clinical stroke risk factors than the CHADS2 score [28, 29]. Although simple, the CHADS2 score has well-recognized limitations [30], including poor identification of AF patients at truly low risk of stroke: patients with a CHADS2 score of 0 (presumably low-risk patients) have had annual stroke rates as high as 3.2% [28].

Table 2 Stroke risk assessment tools in AF

The CHA2DS2-VASc score has been validated in a number of independent cohorts [28, 31,32,33,34,35] and is the recommended tool for stroke risk assessment in most of the latest formal AF guidelines [20, 24]. Compared to other AF-related stroke risk scores, CHA2DS2-VASc is reasonably simple, which is necessary for widespread use in routine clinical practice, and it performs well, especially in reliably identifying AF patients at truly low risk of stroke (i.e., those with no additional stroke risk factors) who do not need any thromboprophylaxis [28, 36,37,38,39]. Recent comparisons of the more complex ATRIA score against the CHA2DS2-VASc score have yielded conflicting results, although the CHA2DS2-VASc score generally outperforms the ATRIA score for risk prediction [35, 37, 40, 41]. Of note, the ATRIA score includes indices of renal function (i.e., estimated glomerular filtration rate and proteinuria), which may not be readily available in a busy outpatient clinic or hospital ward, and uses a complicated 10-year graded scale for age to calculate the score separately for patients with and without prior stroke or TIA [38].
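For illustration only, the sketch below (in Python) tallies the CHA2DS2-VASc score; the component definitions and point weights assumed here follow the widely used scheme summarized in Table 2, and the function name and parameters are purely hypothetical rather than part of any cited study or guideline tool.

# Illustrative tally of the CHA2DS2-VASc score (see Table 2 for the
# authoritative component definitions); not a clinical decision tool.
def cha2ds2_vasc(age, female, chf=False, hypertension=False, diabetes=False,
                 prior_stroke_tia_te=False, vascular_disease=False):
    """Return the CHA2DS2-VASc score (0-9) for a given risk factor profile."""
    score = 0
    score += 1 if chf else 0                       # C: congestive heart failure/LV dysfunction
    score += 1 if hypertension else 0              # H: hypertension
    score += 2 if age >= 75 else (1 if 65 <= age < 75 else 0)  # A2/A: age >=75 (2 pts) or 65-74 (1 pt)
    score += 1 if diabetes else 0                  # D: diabetes mellitus
    score += 2 if prior_stroke_tia_te else 0       # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0          # V: vascular disease (e.g., prior MI, PAD)
    score += 1 if female else 0                    # Sc: sex category (female)
    return score

# Example: a 67-year-old woman with hypertension scores 1 (age) + 1 (hypertension) + 1 (sex) = 3.
print(cha2ds2_vasc(age=67, female=True, hypertension=True))  # prints 3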

Adding various biomarkers [e.g., markers of cardiac function such as cardiac troponin or N-terminal pro-B-type natriuretic peptide (NT-proBNP)] [42, 43] or cardiac imaging parameters (e.g., left atrial size, morphology and function, or left atrial fibrosis) [44] has been shown to improve the predictive value of clinical risk factor-based scores such as the CHA2DS2-VASc score, but mainly by further substratification of high-risk patients already identified by the “classic” CHA2DS2-VASc score. Since patients with a CHA2DS2-VASc score of ≥2 already have a clear indication for OAC [20, 23, 24], further quantification of their stroke risk would not really influence clinical decision-making and is of little practical value. Nevertheless, in selected AF patients without “classical” CHA2DS2-VASc stroke risk factors (e.g., a 56-year-old man with a family history of stroke) or in those in whom the estimated risk of bleeding apparently exceeds the risk of stroke (e.g., AF patients with a single additional stroke risk factor and a prior bleeding event, or those taking dual antiplatelet therapy after an acute coronary syndrome), a refinement of stroke risk assessment using biomarkers and/or imaging could possibly inform the decision to use OAC. However, more research is needed to define the biomarker cutoff values and the timing of blood sampling that would be informative for stroke risk assessment in clinical practice [45].

The female sex-related risk of stroke in AF has been extensively debated. In a large meta-analysis of 30 studies with a total of nearly 4.5 million participants, the presence of AF in women was associated with a greater relative risk of stroke, all-cause mortality, and cardiovascular outcomes than in men [46]. However, the risk of stroke in women with AF has been shown to be age-dependent: whilst younger women with AF had a similar or even lower risk of stroke compared to their age-matched male counterparts, elderly female AF patients were at higher risk of stroke than age-matched male patients [47,48,49,50,51]. The possible mechanisms of sex-related differences in the risk of AF-related stroke are still not fully elucidated [52].

Female sex is assigned 1 point in the CHA2DS2-VASc score [27]. Hence, a female AF patient without additional CHA2DS2-VASc stroke risk factors will have a score of 1 but should be treated the same as a male AF patient with a CHA2DS2-VASc score of 0 [20, 24, 25]. These truly low-risk AF patients (i.e., men with a CHA2DS2-VASc score of 0 and women with a score of 1) have low annual stroke rates of ≤1% [32, 53], and in several large cohorts the use of antithrombotic therapy in such patients had a neutral or unfavorable effect on stroke, bleeding, or death [54,55,56,57].
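Following the convention described above (a sex-only score of 1 in women being treated like a score of 0 in men), the truly low-risk category can be expressed as a simple rule; the sketch below is again purely illustrative, with a hypothetical function name.

def is_truly_low_risk(cha2ds2_vasc_score, female):
    """True for men with a score of 0 and for women whose only point is female sex."""
    return cha2ds2_vasc_score <= (1 if female else 0)

# Examples: a man with a score of 0 and a woman with a score of 1 are truly low risk;
# a woman with a score of 2 is not.
print(is_truly_low_risk(0, female=False))  # True
print(is_truly_low_risk(1, female=True))   # True
print(is_truly_low_risk(2, female=True))   # False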

“Real-World” Rates of Stroke in Non-anticoagulated Patients with Atrial Fibrillation and One Additional Stroke Risk Factor

Figure 1 shows the annual rates of stroke in AF patients without additional stroke risk factors (i.e., with a CHA2DS2-VASc score of 0 [men] or 1 [women]) and in those with a single additional stroke risk factor, as observed in various real-world AF cohorts. In most of these studies, stroke rates were significantly increased in the presence of one additional stroke risk factor in non-anticoagulated AF patients.

Fig. 1 Real-world stroke rates in AF patients with and without a single additional stroke risk factor [31, 33, 35, 41, 54, 56, 57, 59,60,61, 67, 129]. Komatsu et al. and Suzuki et al. reported stroke rates based on no OAC use at baseline, but there was no record of whether OAC treatment was started during follow-up (hence, the reported stroke rates may be artificially low). Friberg et al. and Aspberg et al. reported stroke rates only in AF patients who were never prescribed OAC, from baseline throughout follow-up (i.e., conditioning on the future)

Annual stroke rates in AF patients with a single additional stroke risk factor observed in these [37, 54, 56, 58,59,60,61] and other observational cohorts [31, 33, 35, 57, 62,63,64,65,66,67] are shown in Table 3. Overall, the stroke rates ranged from 0.5% to 2.75%, reaching 6.60% in the study with the highest annual stroke rate (Table 3). Such variability most probably results from differences in methodology, anticoagulation status, and outcome definitions across the studies [30, 68]. For example, the study by Suzuki et al. was based on non-use of OAC at baseline, but OAC status during follow-up was unknown, so the unusually low stroke event rate could reflect some high-risk patients being started on OAC during follow-up [61].

Table 3 Rates of stroke in non-anticoagulated “real-world” AF patients with a single additional stroke risk factor

Friberg et al. showed how the duration of the “blanking” (quarantine) period influenced the observed stroke rates: overall, the ischemic stroke rate was 5.4% with no blanking period, 3.0% with a 1-week blanking period, and 2.8% with the 4-week blanking period ultimately used in that study [66]. Although registry-based studies generally require a blanking period (during which events are not counted) to achieve a stable population for long-term follow-up, there is no room for such a period in clinical practice, because the decision to use OAC should be made immediately upon documentation of AF rather than several weeks or months later.

The event rate in the aforementioned study by Friberg et al. was also influenced by the definition of the thromboembolic outcome: it approximately doubled (from 0.5–0.7% to 1.3%) when the outcome of ischemic stroke was combined with non-specified stroke, transient ischemic attack (TIA), or systemic embolism [66]. On the basis of the low annual rate of ischemic stroke among AF patients with a single additional stroke risk factor in that study (0.5%), the authors questioned the benefit of OAC in such patients, thereby neglecting the importance of reducing other AF-related outcomes such as mortality or systemic embolism. In addition, the study has been criticized for “conditioning on the future”, since all patients ever given OAC (including during follow-up) were excluded from the analysis, thus introducing a potential selection bias leading to low event rates. The same methodological flaw was apparent in the paper by Aspberg et al. [41]. A more appropriate approach is to censor patients at OAC initiation, as reported by Nielsen et al. [69].

In another study, which used the outcome of ischemic stroke or systemic embolism, the rates of ischemic stroke and mortality in untreated AF patients without additional stroke risk factors were 0.43% and 3.87%, respectively, whilst the corresponding rates in untreated AF patients with a single additional stroke risk factor were 1.50% and 11.30% [57]. Thus, the presence of a single additional stroke risk factor in non-anticoagulated AF patients was associated with a more than threefold increase in the 1-year risk of stroke [hazard ratio (HR) 3.8; 95% confidence interval (CI) 2.61–5.63] and a threefold increase in the risk of death (HR 3.23; 95% CI 2.87–3.63) compared to untreated AF patients without additional stroke risk factors [57].

A meta-analysis of seven large observational studies yielded a 1.61% (0.00–3.23%) annual risk of ischemic stroke in AF patients with one additional stroke risk factor [53], which is slightly below the 1.7% annual stroke risk threshold for the use of VKAs but well above the 0.9% cutoff for NOAC use [13]. The wide confidence interval in that meta-analysis resulted from considerable heterogeneity among the studies. Nevertheless, removing the study with the highest annual stroke risk (6.60%) from the meta-analysis [58] yielded an annual stroke risk of 0.87% (0.28–1.46%), which remains around the threshold for NOAC use.

Several studies have shown that the individual stroke risk factors within the CHA2DS2-VASc score carry different weights with respect to stroke rates [35, 58, 65, 70] (Table 4). In a study from Taiwan, for example, stroke rates among AF patients with a single additional stroke risk factor ranged from 1.91% with hypertension to 3.34% with age 65–74 years [65]. Across studies, age has consistently been identified as the most powerful single stroke risk factor among AF patients with a CHA2DS2-VASc score of 1 (men) or 2 (women). In Asian AF patients, the risk of stroke increases substantially after the age of 50 years [71], and in a recent nationwide cohort study of non-anticoagulated Taiwanese AF patients with a single stroke risk factor in addition to sex, even an age of 20–49 years was associated with an annual stroke rate of 1.33% [72], which is above the threshold for NOAC use.

Table 4 Examples of ischemic stroke rates observed in non-anticoagulated AF patients in registry-based studies

Overall, despite some heterogeneity in the statistical significance of the relationships between individual stroke risk factors and observed stroke rates, the presence of a single CHA2DS2-VASc stroke risk factor was associated with an increased risk of ischemic stroke or of the composite outcome of stroke/systemic embolism/death in observational AF cohort studies (Table 4). Notwithstanding the discrepancies in methodology, anticoagulation status, and the characteristics of the various AF populations, the annual rates of stroke in AF patients with a single additional stroke risk factor were around or well above the contemporary threshold for initiating OAC therapy, either with well-controlled VKAs or, preferably, with NOACs. Improvements in the management of OAC therapy and the better safety profile of available OAC treatments, coupled with data from contemporary AF cohorts, support the use of OAC in this subset of AF patients.

Effectiveness of OAC in AF Patients with a Single Additional Stroke Risk Factor in “Real-World” Observational Studies

In the aforementioned Danish nationwide AF cohort study, the use of warfarin in AF patients with a single additional stroke risk factor reduced their stroke risk to a level comparable to that of AF patients with no additional stroke risk factors (HR 1.33; 95% CI 0.76–2.33) [57]. The use of warfarin in AF patients with a single additional stroke risk factor was also associated with a reduction in stroke compared to aspirin (HR 0.67; 95% CI 0.43–1.06) or no therapy (HR 0.69; 95% CI 0.48–0.99), with no increase in bleeding rates with warfarin relative to aspirin (HR 1.06; 95% CI 0.78–1.43). The use of OAC was also associated with a significant reduction in mortality compared to aspirin (HR 0.68; 95% CI 0.55–0.83) or no antithrombotic therapy (HR 0.42; 95% CI 0.36–0.50) in these patients [57].

In a recent hospital registry-based study in France, the use of OAC in AF patients with no additional stroke risk factors was associated with a non-significant reduction in the composite outcome of stroke, systemic embolism, or death compared to no OAC therapy (HR 0.68; 95% CI 0.35–1.31, p = 0.25). In contrast, the use of OAC in AF patients with a single additional stroke risk factor was associated with a significant reduction in stroke, systemic embolism, or death compared with no OAC therapy (HR 0.59; 95% CI 0.40–0.86, p = 0.007) [70].

Evaluation of the ultimate benefit of OAC therapy in AF patients requires consideration of both the efficacy and the safety of oral anticoagulant drugs, since these drugs inevitably affect hemostasis and thus increase the risk of bleeding events. The concept of net clinical benefit (NCB) of OAC therapy represents the balance between the prevention of AF-related thromboembolic events and the risk of OAC-related bleeding, whereby the most serious bleeding events (essentially ICH) are given greater weight than ischemic strokes because of their worse outcomes in terms of fatality and extensive permanent disability. The first quantitative assessment of such a weighted NCB with warfarin treatment in AF patients used the following formula: the annual rate of ischemic stroke or systemic embolism prevented by warfarin minus the annual rate of warfarin-attributable ICH multiplied by a weighting factor of 1.5, which was chosen arbitrarily [73]. A similar concept and other approaches have subsequently been used to estimate the NCB of NOACs, VKAs, or other antithrombotic therapies in patients with AF [74,75,76,77,78].
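Expressed schematically, the weighted NCB described above is:

NCB = (annual rate of ischemic stroke/systemic embolism prevented by warfarin) − 1.5 × (annual rate of warfarin-attributable ICH)

With purely illustrative numbers (not taken from any of the cited studies), if warfarin prevents 1.2 thromboembolic events per 100 patient-years at the cost of 0.1 attributable ICH events per 100 patient-years, the weighted NCB would be 1.2 − 1.5 × 0.1 = 1.05 events prevented per 100 patient-years.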

An NCB analysis among AF patients with a CHA2DS2-VASc score of ≥1 in the Danish nationwide AF cohort yielded a neutral or positive NCB with VKA therapy (in patients with a CHA2DS2-VASc score of 1, the NCB with warfarin was neutral at 0.25 [95% CI −0.86 to 1.36]) [55] and a positive NCB with NOACs (in patients with a CHA2DS2-VASc score of 1, the NCB with dabigatran 150 mg twice daily, for example, was 1.36 [95% CI 0.86–2.58]) [54].

In a study of a community-based cohort of unselected AF patients with one non-gender-related additional stroke risk factor [70], Fauchier et al. reported that the use of a VKA was associated with a positive NCB compared to no therapy (NCB 0.30; 95% CI 0.15–0.61, or 1.42; 95% CI 1.01–1.99, using the NCB calculation methods proposed by Singer et al. [73] and Connolly et al. [79], respectively) or aspirin (NCB 0.43; 95% CI 0.24–0.78, or 2.14; 95% CI 1.62–2.82), whilst the use of aspirin yielded a negative NCB compared to no therapy (NCB −0.13; 95% CI −1.06 to −0.02, or −0.72; 95% CI −1.50 to −0.34) [70]. The Swedish AF cohort study also showed that the use of OAC (VKAs) was associated with a positive NCB in almost all AF patients, excluding only those with a CHA2DS2-VASc score of 0 who were at moderately increased risk of bleeding; of note, in AF patients with a CHA2DS2-VASc score of 1, the NCB was 0.30 (95% CI 0.10–0.40) [56].

Although VKA therapy is generally very well managed in Sweden compared to many other countries [80], the treatment effect of NOACs compared with VKAs for the prevention of stroke and systemic embolism appears consistent regardless of the center-based time in therapeutic range (cTTR) [81], in keeping with the observation that rates of major bleeding do not change significantly across center-average and individual TTR quartiles [82, 83]. Moreover, well-managed VKA therapy with a TTR of ≥70% is often difficult to maintain in clinical practice, as shown in many registry-based studies [84, 85] and even in randomized clinical trials [86]: in one report, for example, even though the median TTR was 73%, a TTR of ≥70% was achieved in only 55% of patients [87]. Thus, NOACs may be preferred over VKAs in many clinical circumstances, provided that good adherence to therapy can be achieved. An increasing body of evidence suggests that the performance of NOACs in real-world settings is broadly similar to their efficacy and safety in the respective randomized clinical trials [88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106].

The outcomes of VKA therapy are highly dependent on the quality of anticoagulation, as measured by the individual patient’s TTR [84], and OAC-naive patients are particularly vulnerable to serious OAC-related adverse events (both thromboembolism and bleeding) during the first months of treatment, i.e., the OAC inception period [107]. The SAMe-TT2R2 score assigns 1 point each for female sex, age <60 years, a history of two or more comorbidities (i.e., hypertension, diabetes mellitus, coronary artery disease/myocardial infarction, peripheral arterial disease, congestive heart failure, previous stroke, pulmonary disease, and hepatic or renal disease), and treatment with drugs interacting with VKAs (e.g., amiodarone), and 2 points each for current or recent tobacco use and non-Caucasian ethnicity. It has been shown to have reasonably good predictive ability in identifying OAC-naive AF patients who would do well on VKAs (those with a SAMe-TT2R2 score of 0–2), whilst those with a SAMe-TT2R2 score of >2 should be prescribed a NOAC [108]. The SAMe-TT2R2 score is predictive not only of the quality of anticoagulation with VKAs but also of all-cause mortality and a composite endpoint of thromboembolic events, major bleeding, and mortality [109,110,111,112,113,114].
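For illustration, the SAMe-TT2R2 point assignment listed above can be tallied as in the following sketch; the function name and parameters are hypothetical and simply mirror the component description in the text.

def same_tt2r2(female, age_under_60, two_or_more_comorbidities,
               interacting_drugs, tobacco_use, non_caucasian):
    """Tally the SAMe-TT2R2 score: 1 point each for female sex, age <60 years,
    >=2 comorbidities, and VKA-interacting drugs (e.g., amiodarone);
    2 points each for current/recent tobacco use and non-Caucasian ethnicity."""
    return (1 * female + 1 * age_under_60 + 1 * two_or_more_comorbidities
            + 1 * interacting_drugs + 2 * tobacco_use + 2 * non_caucasian)

# Example: a 55-year-old female smoker with no comorbidities or interacting drugs
# scores 1 + 1 + 2 = 4 (>2), favoring a NOAC over a VKA [108].
print(same_tt2r2(female=True, age_under_60=True, two_or_more_comorbidities=False,
                 interacting_drugs=False, tobacco_use=True, non_caucasian=False))  # prints 4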

Given the superior safety of NOACs relative to VKAs in terms of reduced risk of ICH, and their more convenient use, NOACs should be considered the first-line antithrombotic treatment option in most patients with AF and a single additional stroke risk factor. Alternatively, particularly where reimbursement policies are restrictive, the choice between NOACs and VKAs could be guided by the SAMe-TT2R2 score: patients with a SAMe-TT2R2 score of 0–2 would potentially do well on VKAs, whilst those with a score of >2 are less likely to do so unless additional measures (such as more regular/frequent follow-up, INR checks, and counselling) are provided, and are therefore better served by a NOAC.

Patient Values and Preferences

Although OAC therapy achieves the greatest absolute reduction in stroke risk in AF patients at the highest risk of stroke, AF patients at “low to moderate” risk of stroke still derive a clearly positive net clinical benefit from OAC therapy (particularly with NOACs). Nevertheless, in a recent Canadian survey of physicians and AF patients [115], fear of OAC-related major bleeding complications was ranked as the highest-priority OAC-related consideration by physicians, whilst it was placed only fifth by patients (of note, patients were more concerned with interactions of OAC with food and drugs, the possibility of rapid reversal of the OAC effect in emergency situations, the clinical experience with a particular OAC, and the requirement for regular blood testing). Another study, using an iPad-facilitated questionnaire, revealed that AF patients were willing to accept four major bleeding events in exchange for preventing one stroke and that, in their view, the treatment threshold for accepting OAC therapy was a minimum absolute stroke risk reduction of 0.8% per year (corresponding to a number needed to treat of 125 AF patients) [116].

Moreover, in a recent study, all health-related quality of life (QoL) scores (SF-36) were significantly lower in warfarin-treated than in NOAC-treated patients, which may be explained by the higher rates of bleeding and hospital admission while on warfarin treatment [117].

A recent survey among European electrophysiology centers showed that practicing European cardiologists spend a considerable amount of time discussing individual risk profiles and available therapies with their AF patients [118]. In a randomized trial, an educational intervention in AF patients resulted in improved quality of oral anticoagulation with VKAs [119]. Engaging AF patients in informed, shared decision-making about OAC therapy (either VKAs or NOACs) facilitates their understanding of treatment and helps to elicit (and correct) possible misperceptions or personal barriers to OAC treatment, thus improving adherence to therapy and ultimately treatment effects [120, 121]. A questionnaire-based tool facilitating the identification of patients’ values and preferences and supporting the decision regarding the use of VKAs or NOACs has been described [122].

Importantly, the individual risk profile of AF patients initially presenting in a “borderline” or “moderate” risk category may change over time [123]. Regular clinical follow-up and periodic re-assessment of the individual patient’s risk profile are mandated, since aging and/or the development of cardiac, renal, or other comorbidities may aggravate the patient’s stroke or bleeding risk [124], which may require adjustments to OAC treatment with respect to the choice of oral anticoagulant drug and appropriate dosing [125].

Conclusion

Patients with non-valvular AF and a single additional stroke risk factor may be denied OAC because of the misperception that their risk of stroke is not sufficiently high to justify its use (either VKAs or NOACs). Observational data from real-world AF cohorts show that the annual stroke rates in such patients are higher than in patients with no additional stroke risk factors and are around or well above the contemporary threshold for OAC treatment. Well-controlled VKA or NOAC therapy in these patients has been associated with a positive net clinical benefit owing to reductions in the risk of stroke, systemic embolism, or death compared to no therapy or aspirin, with no increase in the risk of bleeding relative to aspirin.

Given the superior safety and convenience of NOACs relative to VKAs, NOACs should be considered the first-line antithrombotic treatment option in most AF patients with a single additional stroke risk factor. Regular clinical follow-up and periodic re-assessment of the individual patient’s risk profile are mandated, since aging and/or the development of cardiac, renal, or other comorbidities may aggravate the patient’s stroke or bleeding risk.