The chapter outlines key principles of Cognitive CDA, which inherits its social theory from CDA and, from cognitive linguistics, a particular view of language and a framework for analysing language (as well as other semiotic modes). In connection with CDA, the chapter describes the dialectical relationship conceived between discourse and society. Key concepts relating to the dialogicality of discourse are also introduced, namely intertextuality and interdiscursivity. The central role of discourse in maintaining power and inequality is described, with a focus on the ideological and legitimating functions of language and conceptualisation. In connection with cognitive linguistics, the chapter describes the non-autonomous nature of language, the continuity between grammar and the lexicon, and the experiential grounding of language. The key concept of construal and its implications for ideology in language and conceptualisation are discussed. A framework in which construal operations are related to discursive strategies and to domain-general cognitive systems and processes is set out. The chapter closes by briefly introducing the main models and methods of Cognitive CDA.
Objective:
To test educational interventions to increase the quality of care in telemedicine.
Background:
Telemedicine (TM) has become an essential tool to practise medicine around the world. However, education to address clinical skills in TM remains an area of need globally across the health professions. We aim to evaluate the impact of a pilot online learning platform (OLP) and standardized coaching programme on the quality of medical student TM clinical skills.
Methods:
A randomized pilot study was conducted with fourth-year medical students (n = 12). All participants engaged in video-recorded standardized patient (SP) simulated encounters to assess TM clinical skills before and after the intervention. Participants were randomized to either the OLP or the OLP + Virtual Coaching Institute (VCI) intervention cohort. Quantitative and qualitative data were collected to assess self-reported skills, attitudes, and self-efficacy before the first SP encounter and after the second SP encounter. SP encounter recordings were scored by two blinded non-investigator raters using a standardized rubric to measure the change in TM care delivered pre- and post-intervention. Statistical analysis of quantitative data included descriptive statistics and mixed-effects ANOVA.
Findings:
Recruitment and retention of participants exceeded expectations, pointing to significant enthusiasm for this educational opportunity. Self-reported skills and scored simulation skills demonstrated significant improvements for all participants receiving the interventions. Both the OLP and VCI interventions were well received and feasible, and both demonstrated statistically significant efficacy in improving TM clinical skills. Participants who received coaching described greater improvements in self-efficacy, confidence, and overall virtual clinical skills. This study provides evidence that virtualized clinical learning environments can positively impact the development of TM clinical skills among medical students. As TM continues to evolve, the implementation of innovative training approaches will be crucial in preparing the next generation of healthcare professionals for the demands of modern healthcare delivery.
This paper proposes an ordinal generalization of the hierarchical classes model originally proposed by De Boeck and Rosenberg (1988). Any hierarchical classes model implies a decomposition of a two-way two-mode binary array M into two component matrices, called bundle matrices, which represent the association relation and the set-theoretical relations among the elements of both modes in M. Whereas the original model restricts the bundle matrices to be binary, the ordinal hierarchical classes model assumes that the bundles are ordinal variables with a prespecified number of values. This generalization results in a classification model with classes ordered along ordinal dimensions. The ordinal hierarchical classes model is shown to subsume Coombs and Kao's (1955) model for nonmetric factor analysis. An algorithm is described to fit the model to a given data set and is subsequently evaluated in an extensive simulation study. An application of the model to student housing data is discussed.
The Vale–Maurelli (VM) approach to generating non-normal multivariate data involves the use of Fleishman polynomials applied to an underlying Gaussian random vector. This method has been extensively used in Monte Carlo studies during the last three decades to investigate the finite-sample performance of estimators under non-Gaussian conditions. The validity of conclusions drawn from these studies clearly depends on the range of distributions obtainable with the VM method. We deduce the distribution and the copula for a vector generated by a generalized VM transformation, and show that it is fundamentally linked to the underlying Gaussian distribution and copula. In the process we derive the distribution of the Fleishman polynomial in full generality. While data generated with the VM approach appears to be highly non-normal, its truly multivariate properties are close to the Gaussian case. A Monte Carlo study illustrates that generating data with a different copula than that implied by the VM approach severely weakens the performance of normal-theory based ML estimates.
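The cubic (Fleishman) transform at the heart of the VM method can be sketched as follows. This is an illustrative Python implementation, not the authors' code; the target moments (skewness 2, excess kurtosis 7) are a standard feasible example from the simulation literature.

```python
import numpy as np
from scipy.optimize import fsolve

def fleishman_coefficients(skew, ex_kurt):
    """Solve Fleishman's moment equations for b, c, d (with a = -c), so that
    Y = a + b*Z + c*Z**2 + d*Z**3 has mean 0, variance 1, and the requested
    skewness and excess kurtosis, where Z is standard normal."""
    def equations(p):
        b, c, d = p
        return (
            b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1.0,                      # unit variance
            2*c*(b**2 + 24*b*d + 105*d**2 + 2) - skew,                  # skewness
            24*(b*d + c**2*(1 + b**2 + 28*b*d)
                + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - ex_kurt,  # excess kurtosis
        )
    b, c, d = fsolve(equations, (0.9, 0.15, 0.05))
    return -c, b, c, d

# Classic feasible target: skewness 2, excess kurtosis 7
a, b, c, d = fleishman_coefficients(2.0, 7.0)
rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)
y = a + b*z + c*z**2 + d*z**3   # marginally non-normal, Gaussian copula underneath
```

In the multivariate VM construction the same polynomial is applied coordinate-wise to a correlated Gaussian vector, which is why, as the abstract notes, the copula stays essentially Gaussian even though the margins look highly non-normal.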
Six different algorithms to generate widely different non-normal distributions are reviewed. These algorithms are compared in terms of speed, simplicity and generality of the technique. The advantages and disadvantages of using these algorithms are briefly discussed.
The use of p-values in combining the results of independent studies often involves studies that are potentially aberrant either in quality or in actual values. A robust data analysis suggests the use of a statistic that takes these aberrations into account by trimming some of the largest and smallest p-values. We present a trimmed statistic based on an inverse cumulative normal transformation of the ordered p-values, together with a simple and convenient method for approximating the distribution and first two moments of this statistic.
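The construction can be sketched in a few lines. This is an illustrative version only: the choice of trimming count and the approximation to the statistic's null distribution and moments are developed in the paper and are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def trimmed_probit_sum(pvals, g):
    """Sum the inverse-normal (probit) transforms of the ordered p-values
    after trimming the g smallest and g largest, guarding against aberrant
    studies at either extreme."""
    z = np.sort(norm.ppf(np.asarray(pvals, dtype=float)))
    return z[g:len(z) - g].sum() if g > 0 else z.sum()

# Six study p-values; trim one from each end
s = trimmed_probit_sum([0.001, 0.02, 0.10, 0.30, 0.50, 0.97], g=1)
```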
We give an account of Classical Test Theory (CTT) in terms of the more fundamental ideas of Item Response Theory (IRT). This approach views classical test theory as a very general version of IRT, and the commonly used IRT models as detailed elaborations of CTT for special purposes. We then use this approach to CTT to derive some general results regarding the prediction of the true score of a test from an observed score on that test, as well as from an observed score on a different test. This leads us to a new view of linking tests that were not developed to be linked to each other. In addition, we propose true-score prediction analogues of the Dorans and Holland measures of the population sensitivity of test linking functions. We illustrate the accuracy of the first-order theory using simulated data from the Rasch model, and illustrate the effect of population differences using a set of real data.
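In its simplest same-test form, the first-order true-score prediction the abstract refers to is Kelley's classical formula, which shrinks the observed score toward the group mean by the test's reliability. A minimal sketch (the numbers are illustrative, not the paper's data):

```python
def predict_true_score(x, reliability, mean):
    """Kelley's formula: regress the true score on the observed score,
    shrinking the observed score toward the group mean by the reliability."""
    return reliability * x + (1.0 - reliability) * mean

# Observed score 80 on a test with reliability 0.7 and group mean 50
t_hat = predict_true_score(80.0, 0.7, 50.0)  # 0.7*80 + 0.3*50 = 71.0
```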
This paper presents an analysis, based on simulation, of the stability of principal components. Stability is measured by the expectation of the absolute inner product of the sample principal component with the corresponding population component. A multiple regression model to predict stability is devised, calibrated, and tested using simulated Normal data. Results show that the model can provide useful predictions of individual principal component stability when working with correlation matrices. Further, the predictive validity of the model is tested against data simulated from three non-Normal distributions. The model predicted very well even when the data departed from normality, thus giving robustness to the proposed measure. Used in conjunction with other existing rules this measure will help the user in determining interpretability of principal components.
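The stability measure itself — the absolute inner product between a sample principal component and its population counterpart — can be computed directly. A small sketch with an illustrative covariance matrix (not the paper's simulation design):

```python
import numpy as np

rng = np.random.default_rng(42)

# Population covariance with a clearly dominant first component
pop_cov = np.diag([5.0, 2.0, 1.0, 0.5])
pop_vecs = np.linalg.eigh(pop_cov)[1][:, ::-1]   # columns sorted by decreasing eigenvalue

# Sample from the population, estimate, and compare component by component
x = rng.multivariate_normal(np.zeros(4), pop_cov, size=2000)
samp_cov = np.cov(x, rowvar=False)
samp_vecs = np.linalg.eigh(samp_cov)[1][:, ::-1]

# Stability of component i: |<sample PC_i, population PC_i>|, in [0, 1]
stability = np.abs(np.sum(samp_vecs * pop_vecs, axis=0))
```

Taking the absolute value removes the sign indeterminacy of eigenvectors; values near 1 indicate a sample component closely aligned with its population counterpart.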
The jackknife by groups and modifications of the jackknife by groups are used to estimate standard errors of rotated factor loadings for selected populations in common factor model maximum likelihood factor analysis. Simulations are performed in which t-statistics based upon these jackknife estimates of the standard errors are computed. The validity of the t-statistics and their associated confidence intervals is assessed. Methods are given through which the computational efficiency of the jackknife may be greatly enhanced in the factor analysis model.
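The generic delete-one-group jackknife standard error can be sketched as follows, for an arbitrary statistic; applying it to rotated factor loadings as in the paper is not reproduced here.

```python
import numpy as np

def jackknife_groups_se(data, stat, n_groups):
    """Jackknife-by-groups standard error: delete one group at a time,
    recompute the statistic, and combine the leave-one-group-out values."""
    groups = np.array_split(np.asarray(data, dtype=float), n_groups)
    loo = np.array([
        stat(np.concatenate([g for j, g in enumerate(groups) if j != i]))
        for i in range(n_groups)
    ])
    # Grouped jackknife variance: (g - 1)/g * sum of squared deviations
    return np.sqrt((n_groups - 1) / n_groups * np.sum((loo - loo.mean())**2))
```

With groups of size one and the sample mean as the statistic, this reproduces the textbook standard error of the mean, s/√n.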
A standard approach for handling ordinal data in covariance analysis such as structural equation modeling is to assume that the data were produced by discretizing a multivariate normal vector. Recently, concern has been raised that this approach may be less robust to violation of the normality assumption than previously reported. We propose a new perspective for studying the robustness toward distributional misspecification in ordinal models using a class of non-normal ordinal covariance models. We show how to simulate data from such models, and our simulation results indicate that standard methodology is sensitive to violation of normality. This emphasizes the importance of testing distributional assumptions in empirical studies. We include simulation results on the performance of such tests.
An approach to generate non-normality in multivariate data based on a structural model with normally distributed latent variables is presented. The key idea is to create non-normality in the manifest variables by applying non-linear linking functions to the latent part, the error part, or both. The algorithm corrects the covariance matrix for the applied function by approximating the deviance using an approximated normal variable. We show that the root mean square error (RMSE) for the covariance matrix converges to zero as sample size increases and closely approximates the RMSE as obtained when generating normally distributed variables. Our algorithm creates non-normality affecting every moment, is computationally undemanding, easy to apply, and particularly useful for simulation studies in structural equation modeling.
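The key idea — pushing a normal latent variable through a non-linear link while keeping the implied variances intact — can be sketched for a one-factor model. The lognormal link and the loadings below are illustrative choices, not the authors' algorithm or its covariance correction.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(1)
n = 100_000

# One-factor model x = loading * f(eta) + eps, with non-normality induced
# by a non-linear link f applied to the (normal) latent variable eta
loadings = np.array([0.8, 0.7, 0.6])
eta = rng.standard_normal(n)

# Standardized lognormal link: mean 0, variance 1, strong positive skew
eta_nl = (np.exp(eta) - np.exp(0.5)) / np.sqrt((np.e - 1) * np.e)

# Unique variances chosen so each manifest variable has unit variance
eps = rng.standard_normal((n, 3)) * np.sqrt(1.0 - loadings**2)
x = eta_nl[:, None] * loadings[None, :] + eps
```

Standardizing the linked latent variable keeps the model-implied covariance matrix unchanged, which mirrors the correction step described in the abstract, while the skew of the link propagates into every manifest variable.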
This paper considers a multivariate normal model with one of the component variables observable only in polytomous form. The maximum likelihood approach is used for estimation of the parameters in the model. The Newton-Raphson algorithm is implemented to obtain the solution of the problem. Examples based on real and simulated data are reported.
The Non-Equivalent groups with Anchor Test (NEAT) design involves missing data that are missing by design. Three nonlinear observed score equating methods used with a NEAT design are the frequency estimation equipercentile equating (FEEE), the chain equipercentile equating (CEE), and the item-response-theory observed-score-equating (IRT OSE). These three methods each make different assumptions about the missing data in the NEAT design. The FEEE method assumes that the conditional distribution of the test score given the anchor test score is the same in the two examinee groups. The CEE method assumes that the equipercentile functions equating the test score to the anchor test score are the same in the two examinee groups. The IRT OSE method assumes that the IRT model employed fits the data adequately, and the items in the tests and the anchor test do not exhibit differential item functioning across the two examinee groups. This paper first describes the missing data assumptions of the three equating methods. Then it describes how the missing data in the NEAT design can be filled in a manner that is coherent with the assumptions made by each of these equating methods. Implications for equating are also discussed.
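To make the contrast concrete, here is a toy sketch of the chained approach (CEE): equate X to the anchor A in group 1, then A to Y in group 2. Real applications work with smoothed score distributions; the tiny score vectors here are purely illustrative.

```python
import numpy as np

def equipercentile_map(x, y):
    """Equipercentile function: map a score s on form X to the score on
    form Y with the same percentile rank (continuous approximation via
    linear interpolation of quantiles)."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    def eq(s):
        p = np.searchsorted(x, s, side="right") / len(x)  # percentile rank of s on X
        return float(np.quantile(y, min(max(p, 0.0), 1.0)))
    return eq

# Chain equipercentile sketch: X -> anchor A in group 1, then A -> Y in group 2
x_g1 = [10, 12, 15, 18, 20]; a_g1 = [5, 6, 7, 8, 9]
a_g2 = [4, 6, 7, 9, 10];     y_g2 = [11, 14, 16, 19, 22]
x_to_a = equipercentile_map(x_g1, a_g1)
a_to_y = equipercentile_map(a_g2, y_g2)

def chained(s):
    return a_to_y(x_to_a(s))
```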
This chapter introduces detailed mathematical modelling for diffusion-based molecular communication systems. Mathematical and physical aspects of diffusion are covered, such as the Wiener process, drift, first arrival time distributions, the effect of concentration, and Fick’s laws. Simulation of molecular communication systems is also discussed.
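For one-dimensional diffusion with positive drift, the first arrival time at a target distance follows the classical inverse Gaussian law, with mean d/v and shape parameter d²/(2D). A sketch with illustrative parameters (a standard drift–diffusion result, not tied to this chapter's specific notation):

```python
import numpy as np
from scipy.stats import invgauss

# Wiener process with drift v and diffusion coefficient D; first arrival at
# distance d is inverse Gaussian with mean d/v and shape lambda = d^2/(2D)
d, v, D = 10.0, 1.0, 0.5
mean, lam = d / v, d**2 / (2 * D)

# scipy parameterization: invgauss(mu=mean/lam, scale=lam)
arrival = invgauss(mu=mean / lam, scale=lam)

rng = np.random.default_rng(7)
samples = arrival.rvs(size=100_000, random_state=rng)  # simulated arrival times
```

Sampling arrival times directly from this distribution is far cheaper than time-stepping the Wiener process itself, which is why it is the workhorse of molecular communication simulations.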
This chapter introduces detailed mathematical modelling for biological molecular communication systems, particularly for ligand–receptor systems. Models for chemical kinetics are introduced, including the master equation, and these are applied to membrane ion channels. Simulation of these systems is also discussed.
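For the simplest case, a two-state (closed ⇌ open) channel, the master equation reduces to one linear ODE with a closed-form solution. A sketch with illustrative rate constants:

```python
import numpy as np

# Two-state ion channel (closed <-> open) with opening rate alpha and
# closing rate beta. The master equation for the open probability is
#   dp/dt = alpha * (1 - p) - beta * p,
# with stationary open probability alpha / (alpha + beta).
alpha, beta = 2.0, 3.0

def p_open(t, p0=0.0):
    """Closed-form solution: exponential relaxation from p0 to the
    stationary value with rate (alpha + beta)."""
    p_inf = alpha / (alpha + beta)
    return p_inf + (p0 - p_inf) * np.exp(-(alpha + beta) * t)
```

With these rates the channel relaxes to an open probability of alpha/(alpha + beta) = 0.4, regardless of its initial state.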
We model the slip length tribometer (SLT), originally presented by Pelz et al. (J. Fluid Mech., vol. 948, 2022, p. A8), in OpenFOAM. The plate tribometer is especially designed to simultaneously measure viscosity and slip length for lubrication gaps in the range of approximately 10 $\mathrm {\mu }$m at temperatures and surface roughnesses relevant to technical applications, with a temperature range of $-30$ to $100\,^\circ \mathrm {C}$ and surface roughness ranging from $10\ \mathrm {nm}$ to $1\ \mathrm {\mu }\mathrm {m}$. A simplified analytical model presented by Pelz et al. (J. Fluid Mech., vol. 948, 2022, p. A8) infers the slip length of the plate from the experimentally measured torque and the plate gap height. The present work verifies the analytical model using axisymmetric flow simulations and presents the effect of the inlet on the numerical velocity profiles. The simulation results are in very good agreement with the results of the analytical model. The main conclusions drawn from this study are that the Navier-slip boundary condition is an effective model for partial slip in computational fluid dynamics simulations and that the inlet has a negligible influence on the fluid flow between the SLT's plates.
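As a much-simplified planar analogue of what the tribometer exploits (not the authors' axisymmetric model), consider plane Couette flow with a Navier-slip condition on the lower wall: the slip length b enters the velocity profile and hence the wall shear stress, which is what a torque measurement ultimately encodes. The parameter values are illustrative.

```python
def couette_slip_profile(y, U, h, b):
    """Plane Couette flow: upper wall at y = h moves at speed U (no slip),
    lower wall at y = 0 obeys the Navier-slip condition u(0) = b * u'(0).
    The linear profile u(y) = U * (y + b) / (h + b) satisfies both."""
    return U * (y + b) / (h + b)

def wall_shear_stress(mu, U, h, b):
    """Shear stress mu * du/dy; slip (b > 0) lowers the measured stress."""
    return mu * U / (h + b)

# Illustrative lubrication-gap numbers: 10 micron gap, 1 micron slip length
U, h, b, mu = 0.1, 10e-6, 1e-6, 0.05
tau = wall_shear_stress(mu, U, h, b)
```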
Anthrax is a bacterial zoonotic disease caused by Bacillus anthracis. We qualitatively examined facilitators and barriers to responding to a potential anthrax outbreak using the capability, opportunity, motivation, behaviour (COM-B) model in the high-risk rural district of Namisindwa, in Eastern Uganda. We chose the COM-B model because it provides a systematic approach for selecting evidence-based techniques and approaches for promoting a prompt behavioural response to anthrax outbreaks. Unpacking these facilitators and barriers enables leaders and community members to understand existing resources and gaps so that they can leverage them for future anthrax outbreaks.
This was a qualitative cross-sectional study that was part of a larger anthrax outbreak simulation study conducted in September 2023. We conducted 10 key informant interviews among key stakeholders. The interviews were audio recorded on Android-enabled phones and later transcribed verbatim. The transcripts were analyzed using a deductive thematic content approach in NVivo 12.
The facilitators were: respondents' knowledge of anthrax disease and anthrax outbreak response, experience and the presence of surveillance guidelines, availability of resources, and the presence of communication channels. The identified barriers were: porous borders that facilitate unregulated animal trade across them, lack of essential personal protective equipment, and lack of funds for surveillance and response activities.
Generally, the district was partially ready for the next anthrax outbreak. It was adequately resourced in terms of human resources, and its technical staff had the knowledge required to respond to an anthrax outbreak, but it lacked adequate funds for animal, environmental, and human surveillance and related response activities. We think that our study findings are generalizable to similar settings and therefore call for the implementation of such periodic evaluations to help leverage the strong areas and improve other aspects. Anthrax is a growing threat in the region, and there should be proactive prevention efforts; specifically, we recommend vaccination of livestock and further research into human vaccines.
Conceptual models of smectite hydration include planar (flat) clay layers that undergo stepwise expansion as successive monolayers of water molecules fill the interlayer regions. However, X-ray diffraction (XRD) studies indicate the presence of interstratified hydration states, suggesting non-uniform interlayer hydration in smectites. Additionally, recent theoretical studies have shown that clay layers can adopt bent configurations over nanometer-scale lateral dimensions with minimal effect on mechanical properties. Therefore, in this study we used molecular simulations to evaluate structural properties and water adsorption isotherms for montmorillonite models composed of bent clay layers in mixed hydration states. Results are compared with models consisting of planar clay layers with interstratified hydration states (e.g. 1W–2W). The small degree of bending in these models (up to 1.5 Å of vertical displacement over a 1.3 nm lateral dimension) had little or no effect on bond lengths and angle distributions within the clay layers. Except for models that included dry states, porosities and simulated water adsorption isotherms were nearly identical for bent or flat clay layers with the same averaged layer spacing. Similar agreement was seen with Na- and Ca-exchanged clays. While the small bent models did not retain their configurations during unconstrained molecular dynamics simulation with flexible clay layers, we show that bent structures are stable at much larger length scales by simulating a 41.6 × 7.1 nm² system that included dehydrated and hydrated regions in the same interlayer.
Using tools from computable analysis, we develop a notion of effectiveness for general dynamical systems as those group actions on arbitrary spaces that contain a computable representative in their topological conjugacy class. Most natural systems one can think of are effective in this sense, including some group rotations, affine actions on the torus and finitely presented algebraic actions. We show that for finitely generated and recursively presented groups, every effective dynamical system is the topological factor of a computable action on an effectively closed subset of the Cantor space. We then apply this result to extend the simulation results available in the literature beyond zero-dimensional spaces. In particular, we show that for a large class of groups, many of these natural actions are topological factors of subshifts of finite type.