{"id":1363,"date":"2020-04-26T16:36:27","date_gmt":"2020-04-26T20:36:27","guid":{"rendered":"https:\/\/sites.bu.edu\/tianlab\/?page_id=1363"},"modified":"2025-03-28T21:00:21","modified_gmt":"2025-03-29T01:00:21","slug":"computational-imaging-with-metasurface","status":"publish","type":"page","link":"https:\/\/sites.bu.edu\/tianlab\/publications\/computational-imaging-with-metasurface\/","title":{"rendered":"Computational imaging with metasurface"},"content":{"rendered":"<p><strong><a href=\"https:\/\/www.degruyter.com\/document\/doi\/10.1515\/nanoph-2024-0759\/html\">Chiral phase-imaging meta-sensors<\/a><\/strong><br \/>\nAhmet M. Erturan , Jianing Liu , Maliheh A. Roueini , Nicolas Malamug , Lei Tian &amp; Roberto Paiella<br \/>\n<em><strong>Nanophotonics<\/strong><\/em> (2025).<br \/>\n<img loading=\"lazy\" src=\"https:\/\/www.degruyter.com\/document\/doi\/10.1515\/nanoph-2024-0759\/asset\/graphic\/j_nanoph-2024-0759_fig_001.jpg\" alt=\"Figure 1: Chiral metasurface photodetectors. (a) Schematic device structure. At the target detection angle, one circular polarization component of the incident light is reflected, while the other is coupled to SPPs. (b) Calculated transmission through the device metasurface for RCP and LCP light at \u03bb0 = 1,550\u202fnm versus angle of incidence \u03b8 on the x\u2013z plane. The shaded region shows the range of possible angles of incidence on the sensor array for a representative microscope configuration with 0.8 objective numerical aperture and 20\u00d7 magnification. (c) Reciprocal-space diagram illustrating the plasmon excitation process in these devices, for light incident along 3 representative directions (labeled A, B, and B\u2032). The red and orange arrows represent the wavevector k SPP and spin angular momentum \u03c3 SPP of the excited SPPs. The combined phase-matching action of the metasurface resonance and PB phase is indicated by the horizontal black arrows. 
The panel on the right-hand side shows the directional relation between in-plane wavevector k || (blue arrow) and spin \u03c3 || (green arrow) for LCP and RCP light. The SPP excitation efficiency is enhanced (suppressed) when \u03c3 || is parallel (antiparallel) to \u03c3 SPP . \" class=\"aligncenter\" width=\"714\" height=\"180\" \/><\/p>\n<p><a href=\"https:\/\/opg.optica.org\/ol\/abstract.cfm?uri=ol-49-20-5759\"><strong>Cell classification with phase-imaging meta-sensors<\/strong><\/a><br \/>\nHaochuan Hu, Jianing Liu, Lei Tian, Janusz Konrad, and Roberto Paiella<br \/>\n<em><strong>Optics Letters<\/strong><\/em> Vol. 49, Issue 20, pp. 5759-5762 (2024).<\/p>\n<p xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\">The development of photonic technologies for machine learning is a promising avenue toward reducing the computational cost of image classification tasks. Here we investigate a convolutional neural network (CNN) where the first layer is replaced by an image sensor array consisting of recently developed angle-sensitive metasurface photodetectors. This array can visualize transparent phase objects directly by recording multiple anisotropic edge-enhanced images, analogous to the feature maps computed by the first convolutional layer of a CNN. The resulting classification performance is evaluated for a realistic task (the identification of transparent cancer cells from seven different lines) through computational-imaging simulations based on the measured angular characteristics of prototype devices. 
Our results show that this hybrid optoelectronic network can provide accurate classification (&gt;90%) similar to its fully digital baseline CNN but with an order-of-magnitude reduction in the number of calculations.<\/p>\n<p><img loading=\"lazy\" src=\"\/tianlab\/files\/2024\/10\/PIMS-636x380.jpeg\" alt=\"\" width=\"500\" height=\"299\" class=\"aligncenter wp-image-2369\" srcset=\"https:\/\/sites.bu.edu\/tianlab\/files\/2024\/10\/PIMS-636x380.jpeg 636w, https:\/\/sites.bu.edu\/tianlab\/files\/2024\/10\/PIMS-1024x612.jpeg 1024w, https:\/\/sites.bu.edu\/tianlab\/files\/2024\/10\/PIMS-768x459.jpeg 768w, https:\/\/sites.bu.edu\/tianlab\/files\/2024\/10\/PIMS.jpeg 1455w\" sizes=\"(max-width: 500px) 100vw, 500px\" \/><\/p>\n<p><a href=\"https:\/\/www.degruyter.com\/document\/doi\/10.1515\/nanoph-2023-0354\/html\"><strong>Asymmetric metasurface photodetectors for single-shot quantitative phase imaging<\/strong><\/a><br \/>\nJianing Liu, Hao Wang, Yuyu Li, Lei Tian, and Roberto Paiella<br \/>\n<em><strong>Nanophotonics<\/strong><\/em> (2023).<\/p>\n<div class=\"abstract\">\n<p>The visualization of pure phase objects by wavefront sensing has important applications ranging from surface profiling to biomedical microscopy, and generally requires bulky and complicated setups involving optical spatial filtering, interferometry, or structured illumination. Here we introduce a new type of image sensor that is uniquely sensitive to the local direction of light propagation, based on standard photodetectors coated with a specially designed plasmonic metasurface that creates an asymmetric dependence of responsivity on angle of incidence around the surface normal. The metasurface design, fabrication, and angle-sensitive operation are demonstrated using a simple photoconductive detector platform. 
The measurement results, combined with computational imaging calculations, are then used to show that a standard camera or microscope based on these metasurface pixels can directly visualize phase objects without any additional optical elements, with state-of-the-art minimum detectable phase contrasts below 10\u202fmrad. Furthermore, the combination of sensors with equal and opposite angular response on the same pixel array can be used to perform quantitative phase imaging in a single shot, with a customized reconstruction algorithm which is also developed in this work. By virtue of its system miniaturization and measurement simplicity, the phase imaging approach enabled by these devices is particularly significant for applications involving space-constrained and portable setups (such as point-of-care imaging and endoscopy) and measurements involving freely moving objects.<\/p>\n<p><img loading=\"lazy\" src=\"\/tianlab\/files\/2023\/08\/asp_phase-636x535.jpeg\" alt=\"\" width=\"636\" height=\"535\" class=\"aligncenter wp-image-2182 size-medium\" srcset=\"https:\/\/sites.bu.edu\/tianlab\/files\/2023\/08\/asp_phase-636x535.jpeg 636w, https:\/\/sites.bu.edu\/tianlab\/files\/2023\/08\/asp_phase-1024x862.jpeg 1024w, https:\/\/sites.bu.edu\/tianlab\/files\/2023\/08\/asp_phase-768x647.jpeg 768w, https:\/\/sites.bu.edu\/tianlab\/files\/2023\/08\/asp_phase.jpeg 1328w\" sizes=\"(max-width: 636px) 100vw, 636px\" \/><\/p>\n<\/div>\n<p><a href=\"https:\/\/opg.optica.org\/oe\/fulltext.cfm?uri=oe-30-16-29074&amp;id=481502\"><strong>Optical spatial filtering with plasmonic directional image sensors<\/strong><\/a><br \/>\nJianing Liu, Hao Wang, Leonard C. Kogos, Yuyu Li, Yunzhe Li, Lei Tian, and Roberto Paiella<br \/>\n<em><strong>Optics Express<\/strong><\/em> Vol. 30, Issue 16, pp. 
29074-29087 (2022).<br \/>\n<span><span style=\"color: #993300;\"><strong>\u2b51<\/strong><strong><em> Editors&#8217; pick<\/em><\/strong><\/span><\/span><\/p>\n<p xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\">Photonics provides a promising approach for image processing by spatial filtering, with the advantage of faster speeds and lower power consumption compared to electronic digital solutions. However, traditional optical spatial filters suffer from bulky form factors that limit their portability. Here we present a new approach based on pixel arrays of plasmonic directional image sensors, designed to selectively detect light incident along a small, geometrically tunable set of directions. The resulting imaging systems can function as optical spatial filters without any external filtering elements, leading to extreme size miniaturization. Furthermore, they offer the distinct capability to perform multiple filtering operations at the same time, through the use of sensor arrays partitioned into blocks of adjacent pixels with different angular responses. To establish the image processing capabilities of these devices, we present a rigorous theoretical model of their filter transfer function under both coherent and incoherent illumination. Next, we use the measured angle-resolved responsivity of prototype devices to demonstrate two examples of relevant functionalities: (1) the visualization of otherwise invisible phase objects and (2) spatial differentiation with incoherent light. 
These results are significant for a multitude of imaging applications ranging from microscopy in biomedicine to object recognition for computer vision.<\/p>\n<p xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\"><img loading=\"lazy\" src=\"\/tianlab\/files\/2023\/08\/asp_abs.jpeg\" alt=\"\" width=\"500\" height=\"159\" class=\"aligncenter wp-image-2185 size-full\" \/><\/p>\n<p><a href=\"https:\/\/www.nature.com\/articles\/s41467-020-15460-0\"><strong>Plasmonic ommatidia for lensless compound-eye vision<\/strong><\/a><br \/>\nLeonard C. Kogos, Yunzhe Li, Jianing Liu, Yuyu Li, Lei Tian &amp; Roberto Paiella<br \/>\n<strong><em>Nature Communications<\/em><\/strong> 11: 1637 (2020).<br \/>\n<span style=\"color: #800000;\"><strong><span><span style=\"color: #993300;\">\u2b51<em> <\/em><\/span><\/span>In the news:<\/strong><\/span><br \/>\n&#8211; BU ENG news:\u00a0<a href=\"http:\/\/www.bu.edu\/eng\/2020\/04\/03\/a-bugs-eye-view\/\">A Bug\u2019s-Eye View<\/a><\/p>\n<p><strong><span><span style=\"color: #993300;\"><span style=\"color: #0000ff;\">\u2b51<\/span><em>\u00a0<\/em><\/span><\/span><a href=\"https:\/\/github.com\/bu-cisl\/Plasmonic-ommatidia-for-lensless-compound-eye-vision\">Github Project<\/a><\/strong><\/p>\n<p>The vision system of arthropods such as insects and crustaceans is based on the compound-eye architecture, consisting of a dense array of individual imaging elements (ommatidia) pointing along different directions. This arrangement is particularly attractive for imaging applications requiring extreme size miniaturization, wide-angle fields of view, and high sensitivity to motion. However, the implementation of cameras directly mimicking the eyes of common arthropods is complicated by their curved geometry. 
Here, we describe a lensless planar architecture, where each pixel of a standard image-sensor array is coated with an ensemble of metallic plasmonic nanostructures that only transmits light incident along a small geometrically-tunable distribution of angles. A set of near-infrared devices providing directional photodetection peaked at different angles is designed, fabricated, and tested. Computational imaging techniques are then employed to demonstrate the ability of these devices to reconstruct high-quality images of relatively complex objects.<\/p>\n<p><img loading=\"lazy\" src=\"\/tianlab\/files\/2020\/04\/ASP-636x435.png\" alt=\"\" width=\"636\" height=\"435\" class=\"size-medium wp-image-1315 aligncenter\" \/><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Chiral phase-imaging meta-sensors Ahmet M. Erturan, Jianing Liu, Maliheh A. Roueini, Nicolas Malamug, Lei Tian &amp; Roberto Paiella Nanophotonics (2025). Cell classification with phase-imaging meta-sensors Haochuan Hu, Jianing Liu, Lei Tian, Janusz Konrad, and Roberto Paiella Optics Letters Vol. 49, Issue 20, pp. 5759-5762 (2024). 
The development of photonic technologies for [&hellip;]<\/p>\n","protected":false},"author":12228,"featured_media":0,"parent":133,"menu_order":9,"comment_status":"closed","ping_status":"closed","template":"page-templates\/no-sidebars.php","meta":[],"_links":{"self":[{"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/pages\/1363"}],"collection":[{"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/users\/12228"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/comments?post=1363"}],"version-history":[{"count":10,"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/pages\/1363\/revisions"}],"predecessor-version":[{"id":2416,"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/pages\/1363\/revisions\/2416"}],"up":[{"embeddable":true,"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/pages\/133"}],"wp:attachment":[{"href":"https:\/\/sites.bu.edu\/tianlab\/wp-json\/wp\/v2\/media?parent=1363"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}