-
Publication number: US20230419694A1
Publication date: 2023-12-28
Application number: US18462930
Application date: 2023-09-07
Applicant: Verily Life Sciences LLC
Inventor: Martin Stumpe , Philip Nelson , Lily Peng
CPC classification number: G06V20/69 , G16H30/40 , G01N1/30 , G06N3/08 , G06T7/0012 , G06T11/001 , G06F18/214 , G06V10/82 , G06V20/695 , G01N2001/302 , G06T2207/20081 , G06T2207/20084 , G06T2207/30024 , G06T2210/41 , G06V2201/03
Abstract: A machine learning predictor model is trained to generate a prediction of the appearance of a tissue sample stained with a special stain such as an IHC stain from an input image that is either unstained or stained with H&E. Training data takes the form of thousands of pairs of precisely aligned images, one of which is an image of a tissue specimen stained with H&E or unstained, and the other of which is an image of the tissue specimen stained with the special stain. The model can be trained to predict special stain images for a multitude of different tissue types and special stain types. In use, an input image, e.g., an H&E image of a given tissue specimen at a particular magnification level, is provided to the model, and the model generates a prediction of the appearance of the tissue specimen as if it were stained with the special stain. The predicted image is provided to a user and displayed, e.g., on a pathology workstation.
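The abstract describes the training setup only at a high level; the sketch below shows one minimal way such a paired-image stain-translation model could be trained, assuming PyTorch, a small convolutional encoder-decoder, and a hypothetical `PairedStainDataset` that yields precisely aligned (H&E, special-stain) patch pairs. The architecture, loss, and names are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch: train an H&E -> special-stain predictor on precisely
# aligned image pairs. Architecture, loss, and dataset are illustrative
# assumptions, not the method disclosed in the patent.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

class StainTranslator(nn.Module):
    """Small convolutional encoder-decoder mapping an RGB patch to an RGB patch."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Sigmoid(),  # predicted stain image in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

def train(paired_dataset, epochs=10, lr=1e-4, device="cpu"):
    # paired_dataset yields (he_patch, special_stain_patch) tensors of shape (3, H, W),
    # extracted from whole-slide images at a fixed magnification and aligned pixel-wise.
    model = StainTranslator().to(device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()  # pixel-wise reconstruction loss; the patent does not specify a loss
    loader = DataLoader(paired_dataset, batch_size=16, shuffle=True)
    for _ in range(epochs):
        for he, target in loader:
            he, target = he.to(device), target.to(device)
            loss = loss_fn(model(he), target)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```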
-
Publication number: US11783603B2
Publication date: 2023-10-10
Application number: US16958555
Application date: 2018-03-07
Applicant: VERILY LIFE SCIENCES LLC
Inventor: Martin Stumpe , Philip Nelson , Lily Peng
IPC: G06K9/62 , G06V20/69 , G16H30/40 , G01N1/30 , G06N3/08 , G06T7/00 , G06T11/00 , G06F18/214 , G06V10/82
CPC classification number: G06V20/69 , G01N1/30 , G06F18/214 , G06N3/08 , G06T7/0012 , G06T11/001 , G06V10/82 , G06V20/695 , G16H30/40 , G01N2001/302 , G06T2207/20081 , G06T2207/20084 , G06T2207/30024 , G06T2210/41 , G06V2201/03
Abstract: A machine learning predictor model is trained to generate a prediction of the appearance of a tissue sample stained with a special stain such as an IHC stain from an input image that is either unstained or stained with H&E. Training data takes the form of thousands of pairs of precisely aligned images, one of which is an image of a tissue specimen stained with H&E or unstained, and the other of which is an image of the tissue specimen stained with the special stain. The model can be trained to predict special stain images for a multitude of different tissue types and special stain types. In use, an input image, e.g., an H&E image of a given tissue specimen at a particular magnification level, is provided to the model, and the model generates a prediction of the appearance of the tissue specimen as if it were stained with the special stain. The predicted image is provided to a user and displayed, e.g., on a pathology workstation.
-
Publication number: US20220148169A1
Publication date: 2022-05-12
Application number: US17453953
Application date: 2021-11-08
Applicant: Verily Life Sciences LLC
Inventor: Craig Mermel , Yun Liu , Naren Manoj , Matthew Symonds , Martin Stumpe , Lily Peng , Kunal Nagpal , Ellery Wulczyn , Davis Foote , David F. Steiner , Po-Hsuan Cameron Chen
Abstract: One example method for AI prediction of prostate cancer outcomes involves receiving an image of prostate tissue; assigning Gleason pattern values to one or more regions within the image using an artificial intelligence Gleason grading model, the model trained to identify Gleason patterns on a patch-by-patch basis in a prostate tissue image; determining relative areal proportions of the Gleason patterns within the image; assigning at least one of a risk score or risk group value to the image based on the determined relative areal proportions; and outputting at least one of the risk score or the risk group value.
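The post-processing step in this abstract (patch-level Gleason calls aggregated into relative areal proportions, then a risk score and risk group) can be illustrated with a short sketch. The weights and thresholds below are purely illustrative assumptions; the patent does not disclose specific values here.

```python
# Hypothetical sketch of the aggregation step described in the abstract: given
# per-patch Gleason pattern calls from a grading model, compute relative areal
# proportions and map them to a risk score and risk group. Weights and
# thresholds are illustrative, not values from the patent.
from collections import Counter

def risk_from_patches(patch_patterns):
    """patch_patterns: list of per-patch labels, e.g. 'benign', 3, 4, or 5."""
    tumor = [p for p in patch_patterns if p in (3, 4, 5)]
    if not tumor:
        return 0.0, "no tumor detected"
    counts = Counter(tumor)
    proportions = {g: counts[g] / len(tumor) for g in (3, 4, 5)}  # relative areal proportions
    # Illustrative risk score: higher-grade patterns weighted more heavily.
    score = 0.2 * proportions[3] + 0.6 * proportions[4] + 1.0 * proportions[5]
    # Illustrative thresholds partitioning the score into risk groups.
    if score < 0.3:
        group = "low risk"
    elif score < 0.7:
        group = "intermediate risk"
    else:
        group = "high risk"
    return score, group

# Example: tumor area that is 60% pattern 3, 30% pattern 4, 10% pattern 5.
print(risk_from_patches([3] * 60 + [4] * 30 + [5] * 10))
```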
-
Publication number: US11983912B2
Publication date: 2024-05-14
Application number: US16958548
Application date: 2018-09-07
Applicant: VERILY LIFE SCIENCES LLC
Inventor: Martin Stumpe , Lily Peng
CPC classification number: G06V10/25 , G01N1/30 , G06T7/0012 , G06V20/695 , G06T2207/20081 , G06T2207/20084 , G06T2207/30024 , G06T2207/30068
Abstract: A method for training a pattern recognizer to identify regions of interest in unstained images of tissue samples is provided. Pairs of images of tissue samples are obtained, each pair including an unstained image of a given tissue sample and a stained image of the given tissue sample. An annotation (e.g., drawing operation) is then performed on the stained image to indicate a region of interest. The annotation information, in the form of a mask surrounding the region of interest, is then applied to the corresponding unstained image. The unstained image and mask are then supplied to train a pattern recognizer. The trained pattern recognizer can then be used to identify regions of interest within novel unstained images.
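The key idea in this abstract is that an annotation drawn on the stained image can be transferred directly to the precisely aligned unstained image as a mask. A minimal sketch of building one such training example is shown below, assuming NumPy and Pillow; the function name and inputs are hypothetical.

```python
# Hypothetical sketch of constructing a training example as described in the
# abstract: an annotation drawn on the stained image is rasterized to a binary
# mask and, because the stained and unstained images are precisely aligned,
# the same mask is applied to the unstained image. Library choices are assumptions.
import numpy as np
from PIL import Image, ImageDraw

def make_training_example(unstained_path, annotation_polygon, image_size):
    """Return (unstained_image, mask) for training a region-of-interest recognizer.

    annotation_polygon: list of (x, y) vertices drawn by the pathologist on the
    stained image; valid for the unstained image as well since the pair is aligned.
    image_size: (width, height) of the aligned images.
    """
    unstained = np.asarray(Image.open(unstained_path).convert("RGB"))
    mask_img = Image.new("L", image_size, 0)
    ImageDraw.Draw(mask_img).polygon(annotation_polygon, outline=1, fill=1)
    mask = np.asarray(mask_img)  # 1 inside the region of interest, 0 elsewhere
    return unstained, mask

# Each (unstained image, mask) pair is then supplied as a training example to the
# pattern recognizer, e.g. a patch-based convolutional segmentation model.
```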
-