
Wilfried M Osberger

age ~52

from Portland, OR

Also known as:
  • Wilfried Michael Osberger
  • Wilfred M Osburger

Wilfried Osberger Phones & Addresses

  • 5815 24th Ave, Portland, OR 97211
  • 330 Uptown Ter, Portland, OR 97210 • (503) 273-0165
  • Beaverton, OR

Work

  • Position:
    Food Preparation and Serving Related Occupations

Education

  • Degree:
    High school graduate or higher

Us Patents

  • Robust Camera Motion Estimation For Video Sequences

  • US Patent:
    6738099, May 18, 2004
  • Filed:
    Feb 16, 2001
  • Appl. No.:
    09/788135
  • Inventors:
    Wilfried M. Osberger - Portland OR
  • Assignee:
    Tektronix, Inc. - Beaverton OR
  • International Classification:
    H04N 7/12
  • US Classification:
    348/699, 375/240.16, 382/236
  • Abstract:
    A robust technique for estimating camera motion parameters calculates the motion vectors for a current frame relative to a previous frame using a multi-scale block matching technique. The means of the motion vectors for the current and previous frames are compared to detect a temporal discontinuity; when such a discontinuity indicates a temporal repetition, such as a frozen field or 3:2 pulldown, processing of the current frame terminates and the camera motion parameter estimate for the previous frame is reused for the current frame. Otherwise, motion vectors for spatially flat areas and text/graphic overlay areas are discarded, and the error-of-fit for the previous frame is tested to determine initial parameters for an iterative camera motion estimation process. If the error-of-fit is less than a predetermined threshold, the camera motion parameters for the previous frame are used as the initial parameters; otherwise a best least-squares fit is used. Outlier motion vectors are then removed and the camera motion parameters recalculated using a least-squares best fit in an iterative process until either the error-of-fit for the current frame is less than the predetermined threshold or the number of iterations exceeds a maximum.
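The iterative fit-and-reject loop the abstract describes can be sketched roughly as follows. This is only an illustration: the 4-parameter global motion model (pan, tilt, zoom, rotation), the thresholds, and the function name are assumptions, not details taken from the patent.

```python
import numpy as np

def fit_camera_motion(points, vectors, err_thresh=0.5, max_iter=10):
    """Iteratively fit a 4-parameter global motion model (pan tx,
    tilt ty, zoom z, rotation r) to block motion vectors, removing
    outliers between iterations. Model (an assumption, not the
    patent's exact formulation):
        vx = tx + z*x - r*y,   vy = ty + z*y + r*x
    """
    pts = np.asarray(points, float)
    vecs = np.asarray(vectors, float)
    for _ in range(max_iter):
        x, y = pts[:, 0], pts[:, 1]
        ones, zeros = np.ones_like(x), np.zeros_like(x)
        # Stack the linear system for both vector components.
        A = np.concatenate([
            np.stack([ones, zeros, x, -y], axis=1),   # vx equations
            np.stack([zeros, ones, y,  x], axis=1),   # vy equations
        ])
        b = np.concatenate([vecs[:, 0], vecs[:, 1]])
        params, *_ = np.linalg.lstsq(A, b, rcond=None)
        resid = (A @ params - b).reshape(2, -1)
        err = np.sqrt((resid ** 2).sum(axis=0))       # per-vector error
        if err.mean() < err_thresh or len(pts) <= 4:
            return params, err.mean()
        keep = err < 2.0 * err.mean()                 # reject outliers
        if keep.all():
            return params, err.mean()
        pts, vecs = pts[keep], vecs[keep]
    return params, err.mean()
```

With consistent (inlier-dominated) motion vectors the loop converges in one or two iterations; the error-of-fit it returns is what the abstract compares against the predetermined threshold.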
  • Measurement Of Blurring In Video Sequences

  • US Patent:
    7099518, Aug 29, 2006
  • Filed:
    Jul 18, 2002
  • Appl. No.:
    10/198944
  • Inventors:
    Bei Li - Beaverton OR, US
    Wilfried M. Osberger - Portland OR, US
  • Assignee:
    Tektronix, Inc. - Beaverton OR
  • International Classification:
    G06K 9/40
    G06K 9/48
  • US Classification:
    382/255, 382/199, 382/266, 382/263
  • Abstract:
    A method for determining blurring in a test video sequence due to video processing includes detecting blocks within each frame of the test video sequence that have valid image edges. An edge point within each valid image edge block is selected, and a series of points defining an edge profile in the block, along a line normal to the valid image edge at each edge point, is taken from an enhanced edge map in which video processing blockiness artifacts have been removed. From the edge profile a blurring value is estimated for each frame or group of frames within the test video sequence. Additionally, a reference blurring value may be derived from a reference video sequence corresponding to the test video sequence; this reference blurring value may be generated at the source of the reference video sequence and transmitted with the test video sequence to a receiver, or may be generated at the receiver. The reference blurring value is then compared with the blurring value from the test video sequence to produce a relative blurring value for the test video sequence.
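The core of an edge-profile blur estimate can be sketched with a common edge-spread measure: how many samples the intensity takes to rise from 10% to 90% across the edge. This is a generic stand-in, not the patent's actual metric, and it assumes the profile actually crosses an edge (non-constant, roughly monotonic).

```python
import numpy as np

def edge_width(profile, lo=0.1, hi=0.9):
    """Estimate blur as the 10%-90% rise distance (in samples) of an
    intensity profile taken along a line normal to an image edge.
    A wider rise means a blurrier edge. Assumes the profile spans a
    rising edge; the thresholds lo/hi are conventional choices, not
    values from the patent.
    """
    p = np.asarray(profile, float)
    p = (p - p.min()) / (p.max() - p.min())   # normalize to [0, 1]
    lo_idx = np.argmax(p >= lo)               # first sample above 10%
    hi_idx = np.argmax(p >= hi)               # first sample above 90%
    return hi_idx - lo_idx
```

A frame-level blurring value would then aggregate (e.g., average) these widths over all valid edge blocks, and a relative value would compare the test sequence's aggregate against the reference sequence's.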
  • Visual Attention Model

  • US Patent:
    20020126891, Sep 12, 2002
  • Filed:
    Jan 17, 2001
  • Appl. No.:
    09/764726
  • Inventors:
    Wilfried Osberger - Portland OR, US
  • International Classification:
    G06K 9/34
  • US Classification:
    382/165, 382/173
  • Abstract:
    An improved visual attention model uses a robust adaptive segmentation algorithm to divide a current frame of a video sequence into a plurality of regions based upon both color and luminance, with each region being processed in parallel by a plurality of spatial feature algorithms including color and skin to produce respective spatial importance maps. The current frame and a previous frame are also processed to produce motion vectors for each block of the current frame, the motion vectors being compensated for camera motion, and the compensated motion vectors being converted to produce a temporal importance map. The spatial and temporal importance maps are combined using weighting based upon eye movement studies.
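The final fusion step, combining per-feature spatial importance maps with a temporal importance map, can be sketched as a weighted sum. The weighted-sum form, the weight values, and the function name below are assumptions for illustration; the patent derives its weighting from eye-movement studies.

```python
import numpy as np

def combine_importance(spatial_maps, temporal_map, spatial_weights, k_t=0.5):
    """Fuse per-feature spatial importance maps (e.g., from color and
    skin detection) with a temporal importance map into one attention
    map, normalized to [0, 1]. The linear fusion and the temporal
    mixing factor k_t are assumptions, not the patent's formulation.
    """
    spatial = sum(w * m for w, m in zip(spatial_weights, spatial_maps))
    combined = (1 - k_t) * spatial + k_t * temporal_map
    return combined / combined.max()          # normalize to [0, 1]
```

Each map here is a per-block (or per-pixel) array over one frame; regions scoring near 1 in the output are those the model predicts will attract visual attention.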
