
Ophthalmology/Optometry


Simple Eye Exam, Coupled With AI, May Aid Early Parkinson's Diagnosis

#artificialintelligence

A simple eye exam, coupled with artificial intelligence (AI) technology, could improve the diagnosis of neurodegenerative disorders, according to …


AI and Telemedicine for the Retina Specialist

#artificialintelligence

We also review Notal Vision's ForeseeHome device and the company's Home OCT AI-enabled platform for monitoring AMD. Kester Nahen, PhD, is the chief executive officer at Notal Vision. We'd love to hear from you! Send your comments/questions to Dr. Mali at eyecareinsider@healio.com.


Engineering near-infrared vision

Science

CATEGORY WINNER: MOLECULAR MEDICINE

Dasha Nelidova completed her undergraduate degrees at the University of Auckland, New Zealand, and her Ph.D. in neurobiology at the Friedrich Miescher Institute for Biomedical Research in Basel, Switzerland. She is currently a postdoctoral researcher at the Institute of Molecular and Clinical Ophthalmology Basel, where she is working to develop new translational technologies for treating retinal diseases that lead to blindness. www.sciencemag.org/content/370/6519/925.2

Photoreceptor degeneration, including age-related macular degeneration and retinitis pigmentosa, is a leading cause of blindness worldwide. Repair of retinal neurons by optogenetics, a technology that sensitizes neurons to light through the transfer of genes for light-sensitive proteins of microbial origin (1, 2), has entered clinical trials (3, 4). Trials began in 2018 in patients with advanced retinitis pigmentosa and minimal remaining vision (4). Optogenetic proteins are sensitive only to the brightest visible light, at intensities that overwhelm surviving functional photoreceptors. Yet in a number of blinding diseases, light-sensitive and light-insensitive photoreceptor zones coexist within the same retina. In macular degeneration, for example, cone photoreceptors of the central retina lose their light sensitivity, while surrounding photoreceptors remain viable and peripheral vision is largely unaffected. A key challenge for new translational technologies that aim to restore the image-acquiring properties of the retina is compatibility with remaining vision. We reasoned that sensitizing the retina to wavelengths that functional photoreceptors cannot detect (>900 nm) could supplement deteriorating natural vision without interfering with the ability to see the visible spectrum.
Inspired by infrared vision in snakes, we developed nanogenetic molecular tools that allowed blind mice and ex vivo human retinas to detect near-infrared (NIR) light (5). Snakes can see the world in two different ways. Like humans, they use their eyes to detect wavelengths of the visible spectrum (400 to 700 nm). In addition, several species can also generate thermal images (6). Snakes detect infrared light (1 to 30 μm) using temperature-sensitive transient receptor potential (TRP) cation channels expressed in a specialized "pit" organ (6). Infrared and visible-spectrum images superimpose within the brain (7), presumably enabling the animals to react to the environment with greater precision than is possible using a single image alone. Snakes can switch back and forth between the two imaging systems or use both simultaneously (7, 8).

TRP channels could potentially be targeted to mammalian retinal cell types to make them sensitive to infrared radiation. However, infrared light would raise the vibrational energies of water molecules throughout the eye. Shorter-wavelength NIR light would be preferable because NIR has lower water absorption, although this same feature also makes direct NIR illumination an inefficient activator of TRP channels. To develop a more efficient NIR light detector for retinal cell types, we engineered a dual system that consists of a genetic and a nanomaterial component (see the figure). The genetic half of the sensor consists of TRP channels engineered to incorporate an extracellular protein epitope tag recognizable by a specific antibody (9). The nanomaterial half consists of gold nanorods conjugated to an antibody against the epitope (10). Gold nanorods serve as antennas for NIR light and convert light into local heat through surface plasmon resonance (11), driving photocurrents through antibody-bound TRP channels.
Subretinal microinjection of virally packaged TRP and nanorods delivered the sensor components to cones. Our initial system was based on TRP vanilloid 1 (TRPV1) channels and gold nanorods with absorption maxima at 915 nm. We began by inserting a 6x-His epitope tag into the middle of the first TRPV1 extracellular loop, measuring the sizes of evoked currents before and after the modification, and confirmed that the channels remained functional. Next, we used adeno-associated virus (AAV)-mediated gene transfer to transduce cone photoreceptors of blind mice with the nanogenetic sensor. To measure neural activity, we performed two-photon calcium imaging of individual neurons within the retina and primary visual cortex. Expression of the nanogenetic sensor in cones rendered blind retinas sensitive to NIR light. Cone photoreceptors (retinal input) and retinal ganglion cells (retinal output) responded vigorously to 915-nm light, and NIR-evoked retinal activity propagated to the brain. This allowed treated mice to use their newly acquired NIR vision to perform behavioral tasks. In complementary experiments, we confirmed that NIR light was unable to activate wild-type cones and did not affect their visible-light responses. Similarly, awake wild-type mice failed to exploit NIR light cues during behavioral training.

Nanorod properties depend on size and shape (11). By changing the length of the gold nanorods from ∼80 nm to ∼120 nm, we tuned NIR vision to a different NIR wavelength (980 nm). Wavelength tuning is important for several reasons. Certain NIR wavelengths might be better tolerated by patients than others. Maximum permissible light doses for the human eye also depend on wavelength. Additionally, NIR vision requires goggles that project images composed of specific NIR wavelengths onto the retina; compatibility with current and future NIR projectors requires tunable NIR detectors.
Across the animal kingdom, multiple variants of thermosensitive proteins can be found, and more can be created through mutagenesis. Channels, tags, and antibodies can be modified to gain additional desirable properties. We selected TRP ankyrin 1 (TRPA1) channels from the Texas rat snake because of their lower thermal thresholds and inserted the newer epitope tag OLLAS (Escherichia coli OmpF Linker and mouse Langerin fusion sequence) (12) into the first extracellular loop. Mice transfected with engineered TRPA1 channels were better able to anticipate water rewards when lights were dimmed than mice transfected with TRPV1, indicating an improvement in the sensitivity of the sensor. (Both TRPA1- and TRPV1-transduced animals performed behavioral tasks as well as wild-type animals trained using visible light.)

The next step was to validate these findings in blind human retinas. To do this, we targeted TRPV1 and gold nanorods to light-insensitive photoreceptors of adult human ex vivo retinal explants. (We had previously developed a cocktail of molecules to keep human retinas alive for 8 weeks post mortem, giving gene expression time to take hold.) We then recorded NIR light-evoked calcium activity and saw fast, strong activation of human photoreceptors and downstream retinal neurons, including ganglion cells. Taken together, these experiments provide proof of principle for the potential therapeutic translation of this technology. Light intensities required to drive the genetically encoded NIR sensors met existing safety standards that specify exposure limits for the human eye, and we further demonstrated that components of the sensor can be exchanged with predictable outcomes. In the future, targeted central repair would allow an island of NIR sensitivity to be built in a sea of natural vision. Parallel developments in surgery (13) and in NIR projectors with eye-tracking capabilities (4) make targeted central repair feasible.
Ultimately, the user may be able to self-select the region of the electromagnetic spectrum most useful for viewing the external world, a decision guided by the state of their retina and ambient light conditions.

1. J. A. Sahel, B. Roska, Annu. Rev. Neurosci. 36, 467 (2013).
2. V. Busskamp et al., Science 329, 413 (2010).
3. S. Makin, Nat. Outlook 10.1038/d41586-019-01107-8 (2019).
4. Dose-escalation Study to Evaluate the Safety and Tolerability of GS030 in Subjects With Retinitis Pigmentosa (PIONEER). Clinical Trials ID: NCT03326336 (2018).
5. D. Nelidova et al., Science 368, 1108 (2020).
6. E. O. Gracheva et al., Nature 464, 1006 (2010).
7. E. A. Newman, P. H. Hartline, Science 213, 789 (1981).
8. E. A. Newman, P. H. Hartline, Sci. Am. 246, 116 (1982).
9. S. A. Stanley et al., Science 336, 604 (2012).
10. P. P. Joshi, S. J. Yoon, W. G. Hardin, S. Emelianov, K. V. Sokolov, Bioconjug. Chem. 24, 878 (2013).
11. Z. Qin, J. C. Bischof, Chem. Soc. Rev. 41, 1191 (2012).
12. S. H. Park et al., J. Immunol. Methods 331, 27 (2008).
13. A. M. Maguire et al., N. Engl. J. Med. 358, 2240 (2008).

Acknowledgments: I thank my thesis adviser, B. Roska, and all molecular and clinical research colleagues at the Institute of Molecular and Clinical Ophthalmology Basel for their enthusiasm, advice, and help. I also thank our collaborators, especially A. Szabo, without whom human retinal experiments would not have been possible.


Irish researcher develops AI to help prevent sight loss

#artificialintelligence

The ability to apply artificial intelligence (AI) to ophthalmology is gathering pace, a consequence of remarkable collaboration between eye specialists and technologists whose forte is processing vast amounts of data quickly. Irish ophthalmologist Dr Pearse Keane, based at Moorfields Eye Hospital, London, has been the chief catalyst in developing AI software to detect 50 sight-threatening eye diseases. It operates by interpreting optical coherence tomography (OCT) scans of the back of the eye, which may soon be a routine part of an eye check. Automated analysis of scans for diseases such as wet age-related macular degeneration (AMD), the main cause of blindness in Europe, and diabetic retinopathy is set to transform patient outcomes: faster results afford earlier diagnosis and prompt treatment, ultimately preventing avoidable sight loss. Since that initial breakthrough, the Keane team has developed an alert system for the roughly one in three people with AMD who go on to develop it in their good eye and, potentially, an early-warning system for the onset of neurodegenerative diseases, notably Alzheimer's.


Diagnos' AI Platform to Fight Vision Loss Continues to Expand Globally

#artificialintelligence

Diagnos Inc. (TSXV: ADK; OTCQB: DGNOF) is a software platform provider for the early detection of critical health issues through the use of Artificial Intelligence ("AI") and Machine Learning ("ML"), and it continues to expand globally with recent deals and government financing support. Utilizing AI and ML, Diagnos developed its Computer Assisted Retina Analysis ("CARA") software platform to process image data, making standard retinal images sharper, clearer, and easier to read. This technology assists healthcare specialists in detecting vision loss, including diabetic retinopathy. CARA is a teleophthalmology platform; teleophthalmology, a branch of telemedicine, delivers eye care through digital medical equipment and telecommunications technology. The CARA platform integrates with various types of existing retinal cameras at the point of care, is compatible with all recognized image formats, and is Electronic Medical Records ("EMR") compatible.


Microsoft's new AI auto-captions images for the visually impaired

#artificialintelligence

A new AI from Microsoft aims to automatically caption images in documents and emails so that software for people with visual impairments can read them out. Researchers from Microsoft explained their machine learning model in a paper on the preprint repository arXiv. The model uses VIsual VOcabulary pre-training (VIVO), which leverages large amounts of paired image-tag data to learn a visual vocabulary. A second dataset of properly captioned images is then used to help teach the AI how best to describe the pictures. "Ideally, everyone would include alt text for all images in documents, on the web, in social media – as this enables people who are blind to access the content and participate in the conversation. But, alas, people don't," said Saqib Shaikh, a software engineering manager with Microsoft's AI platform group.
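The two-stage recipe the snippet describes (tag-level pretraining on abundant image-tag pairs, then caption-level fine-tuning on scarcer captioned images) can be caricatured in a few lines. This is an illustrative data-flow sketch only: the toy data and stub "training" functions are made up for the example and are not Microsoft's actual model or API.

```python
# Illustrative sketch of VIVO's two training stages (toy data, stub logic).

# Stage 1 data: images paired with unordered tags -- cheap and abundant.
image_tag_pairs = [
    ("img_001", {"dog", "grass", "ball"}),
    ("img_002", {"cat", "sofa"}),
]

# Stage 2 data: images paired with full captions -- scarcer and costlier.
image_caption_pairs = [
    ("img_003", "a dog chasing a ball on the grass"),
]

def pretrain_visual_vocabulary(pairs):
    """Stage 1: learn a visual vocabulary from image-tag data.
    (Here reduced to collecting the set of known visual concepts.)"""
    vocab = set()
    for _img, tags in pairs:
        vocab |= tags
    return vocab

def finetune_captioner(vocab, pairs):
    """Stage 2: learn to compose known concepts into fluent captions.
    (Here reduced to measuring what fraction of each caption's words
    the pretrained vocabulary already covers.)"""
    coverage = []
    for _img, caption in pairs:
        words = set(caption.split())
        coverage.append(len(words & vocab) / len(words))
    return coverage

vocab = pretrain_visual_vocabulary(image_tag_pairs)
coverage = finetune_captioner(vocab, image_caption_pairs)
print(sorted(vocab), coverage)
```

The point of the ordering is that stage 1 lets the model recognize objects it has never seen captioned, so stage 2 only has to learn how to arrange known concepts into sentences.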


If we're serious about healthcare equity and access, we must support autonomous AI

#artificialintelligence

The current COVID-19 public health crisis has converged with the racial and socioeconomic injustices that plague our society, highlighting vast differences in health care access. Though the U.S. spends more on health care than any other developed nation, access to critical preventive care remains particularly challenging for many racial and ethnic minority populations, as well as for lower-income and rural Americans. For many, a visit to a specialist for a routine diagnostic exam represents a time-consuming and costly venture. As a result, many serious conditions go undiagnosed until they are advanced, when treatment becomes more expensive and invasive and outcomes are less favorable. In 2019, a record share of U.S. adults (33 percent) said they put off receiving medical care due to cost, according to Gallup's annual Health and Healthcare poll.


AI Can Help Diagnose Some Illnesses--If Your Country Is Rich

#artificialintelligence

Artificial intelligence promises to expertly diagnose disease in medical images and scans. However, a close look at the data used to train algorithms for diagnosing eye conditions suggests these powerful new tools may perpetuate health inequalities. A team of researchers in the UK analyzed 94 data sets, comprising more than 500,000 images, commonly used to train AI algorithms to spot eye diseases. They found that almost all of the data came from patients in North America, Europe, and China. Just four data sets came from South Asia, two from South America, and one from Africa; none came from Oceania.
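As a back-of-envelope check on the imbalance, the regional counts quoted above can be tallied directly. The numbers below are simply those reported in the snippet (94 data sets total; 4, 2, 1, and 0 from the named regions), not the underlying data.

```python
# Tally the regional provenance figures quoted in the article.
regional_counts = {
    "South Asia": 4,
    "South America": 2,
    "Africa": 1,
    "Oceania": 0,
}
total_datasets = 94

underrepresented = sum(regional_counts.values())
share = underrepresented / total_datasets
print(f"{underrepresented} of {total_datasets} data sets ({share:.1%}) "
      "come from South Asia, South America, Africa, or Oceania")
```

In other words, well under a tenth of the training data sets come from regions that hold a large share of the world's population, which is the inequality the researchers flag.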


Machine learning helps grow artificial organs

#artificialintelligence

Researchers from the Moscow Institute of Physics and Technology, Ivannikov Institute for System Programming, and the Harvard Medical School-affiliated Schepens Eye Research Institute have developed a neural network capable of recognizing retinal tissues during the process of their differentiation in a dish. Unlike humans, the algorithm achieves this without the need to modify cells, making the method suitable for growing retinal tissue for developing cell replacement therapies to treat blindness and conducting research into new drugs. The study was published in Frontiers in Cellular Neuroscience. In multicellular organisms, the cells making up different organs and tissues are not the same. They have distinct functions and properties, acquired in the course of development. They start out the same, as so-called stem cells, which have the potential to become any kind of cell the mature organism incorporates.

