Valley of the Sun Casual Club
Welcome to VOTSCC . Please enjoy the many features . You may login at anytime to be part of our community .




VEM VARIOUS PATENTS AND PURPOSES



Post by Paul Sun 27 Mar 2022, 10:27 pm

virtual equipment module;  Assignee: World Golf Tour;  
Automatically adapting virtual equipment model

Abstract



Methods and apparatus, including computer program products, for determining a user skill level for user interaction with virtual equipment in an interactive computer game. The virtual equipment is capable of being manipulated through user interaction with an associated representation. The method includes automatically adapting a virtual equipment model associated with the virtual equipment to reflect the determined user skill level. The virtual equipment model governs how the virtual equipment behaves in response to user interaction with the representation.


Last edited by Paul on Wed 30 Mar 2022, 11:46 pm; edited 1 time in total


Paul

Please enjoy

_________________
May the SUN always be with you
home of https://www.valleyofthesuncc.com/ an information and entertainment only website

Paul
Admin
Posts : 42010
Join date : 2013-05-06
https://www.valleyofthesuncc.com


Re: VEM VARIOUS PATENTS AND PURPOSES

Post by Paul Sun 27 Mar 2022, 10:28 pm

Description

METHOD FOR AUTOMATICALLY ADAPTING A VIRTUAL EQUIPMENT MODEL
The present invention relates to a method for automatically adapting a virtual equipment model.
Computer games and other kinds of simulations typically include a virtual space that users interact with to accomplish one or more goals, such as defeating all "villains" or playing a hole of golf. A virtual space is the environment a user interacts with when playing a computer game, and may include representations of a virtual environment, equipment, objects, characters, and their related states. For example, the virtual space may include a virtual golf course, golf clubs, and golf balls. Users interact with the virtual space through a user interface that can accept input from game controllers (e.g., joysticks, mice, voice commands). For example, a virtual golf club can be swung on a virtual golf course with a click of a mouse button to hit a virtual golf ball.
Typical computer game genres include role-playing, first-person shooter, third-person shooter, sports, racing, martial arts, action, strategy, and simulation; a computer game can combine two or more genres. Popular computer games include Black & White 2 (released by EA Games), Grand Theft Auto (released by Rockstar Games), Perfect Dark Zero (released by Microsoft Game Studios), and Halo 3 (released by Microsoft Game Studios). Computer games are generally available on different computer platforms, such as workstations, personal computers, game consoles 104 (e.g., Sony PlayStation, PlayStation Portable, Microsoft Xbox, Nintendo GameCube, Game Boy), mobile phones 102, and other mobile computing devices. See FIG. 1. Computer games can be single-player or multiplayer. Some multiplayer games allow users connected over the Internet to interact in a common or shared virtual space.
Users interact in the virtual space with one or more pieces of virtual equipment, such as virtual weapons or virtual golf clubs. Virtual equipment can include avatars representing the user and other virtual representations, including but not limited to the user's movements or gestures. For example, a martial arts game may allow a user to hit, kick, or punch a virtual opponent in the virtual space; the virtual equipment in that case is the virtual representation of the user (or of the user's movement or gesture) in the fight.
The virtual space and virtual equipment can change as users achieve their goals. For example, as users progress to higher levels in an action game, the virtual space is typically changed to model a new level, and users are given different virtual equipment, such as more powerful weapons. Some computer games allow users to manually select virtual equipment. For example, user interface 106 (FIG. 1) for a computer golf game allows users to select the type of virtual golf club they wish to use. Users lacking skill may choose fairway wood club 108 over driver 110, which is more difficult to control in the virtual space (as it is in reality). However, in such computer games the virtual equipment is not automatically adapted to the user's proficiency with a given piece of virtual equipment.
In general, in one aspect, embodiments of the invention feature determining a user skill level for a user's interaction with virtual equipment in an interactive computer game. The virtual equipment is capable of being manipulated through user interaction with an associated representation. A virtual equipment model associated with the virtual equipment is automatically adapted to reflect the determined user skill level. The virtual equipment model governs how the virtual equipment behaves in response to user interaction with the representation.
These and other embodiments may optionally include one or more of the following features. The adapting includes changing a sweet spot for the virtual equipment. The sweet spot is a region of a distribution curve for a variable associated with the virtual equipment model. The sweet spot is related to one or more of the accuracy of the user interaction and the precision of the user interaction. The adapting includes changing an input model or the associated representation. The adapting is based on a current state of the virtual space. The determining is performed in response to detecting an improvement or decline in user skill. The representation includes one or more of graphical rendering, sound, or haptic feedback. The adapting further includes changing one or more relationships between variables of the user interaction model. The virtual equipment is one of a golf club, weapon, car, racket, ping pong paddle, or baseball bat.
In general, in another aspect, embodiments of the invention feature determining a user skill level for a user's interaction with virtual equipment in an interactive computer game. The virtual equipment is capable of being manipulated through user interaction with an associated representation. A sweet spot associated with the virtual equipment is automatically adapted based on the determined user skill level, and the sweet spot governs how the virtual equipment behaves in response to user interaction with the representation.
These and other embodiments may optionally include one or more of the following features. The sweet spot is a region of a distribution curve for a variable associated with the virtual equipment. The sweet spot is related to one or more of the accuracy of the user interaction and the precision of the user interaction. The adapting includes changing an input model or the associated representation.
Certain embodiments of the invention may be implemented to realize one or more of the following advantages. Virtual equipment is automatically adapted to reflect changes in user skill, keeping users challenged as they improve; as a result, users do not lose interest in the computer game. The associated input model and the visual representation of the virtual equipment are automatically modified to reflect changes in user skill. Automatic adaptation of virtual equipment adds a new level of realism to computer games and other kinds of simulations, and provides a more accurate reflection of skill in the virtual world without the hindrance of static, limited user interfaces.
The details of embodiments of the invention are set forth in the drawings and the description below. Other features, aspects, and advantages of the invention will be apparent from the description, the drawings, and the claims.
FIG. 1 shows a user interface for selecting a golf club.
FIG. 2 shows four exemplary graphs of equipment control.
FIG. 3 illustrates a virtual equipment model system.
FIG. 4 illustrates a virtual equipment model adaptation process.
FIG. 5 shows a system architecture.
Like reference numbers and designations in the various drawings indicate like elements.
In various implementations, a given piece of virtual equipment has one or more associated "sweet spots". A sweet spot represents the tolerance for imperfect user interaction: the limits within which interaction with a piece of virtual equipment still produces the intended result in the virtual space. In one implementation, a large sweet spot corresponds to a larger allowed deviation on a normalized distribution curve, and a small sweet spot corresponds to a smaller allowed deviation.
For example, different types of golf clubs exist for golfers of different abilities, and each golf club has a sweet spot that varies in size and location. In general, a golfer selects a club based on his swing speed and power and the club's sweet spot. Clubs with large sweet spots tend to be forgiving: they are designed with a large contact surface area and a perimeter weight distribution that compensates for mishits. A swing with a large-sweet-spot club will produce an adequate shot even if it is several standard deviations from the mean (where the mean represents a perfect swing). However, golfers using clubs with large sweet spots give up some control, power, and feel. Professional golf clubs have very small sweet spots and require more skill to hit the ball correctly, but a well-struck shot yields greater distance, control, precision, and accuracy. A swing with a small-sweet-spot club must be closer to the mean to produce a satisfactory shot.
In reality, as users become more skilled with their equipment, that equipment becomes easier to handle, and they can choose new equipment that provides an improved level of control. This forms the basis for automatically adapting one or more sweet spots of virtual equipment according to the user's skill. Graph 202 of FIG. 2 shows standard deviation curves 202b, 202c, and 202d for variables related to the same or different pieces of virtual equipment. For example, curve 202b may represent the power of a virtual golf swing, curve 202c may represent the direction of the virtual club face when the virtual golf swing strikes the virtual golf ball, and curve 202d may represent the trajectory of a virtual fighter's kick or punch. Zero deviation represents the ideal value of a variable for a piece of virtual equipment, such as the ideal power of a virtual golf club swing or the ideal aim of a virtual gun. For a given variable, each standard deviation from zero represents a value further from the ideal. In one implementation, values above threshold 202a (which may vary for each curve) are more likely than values below the threshold to achieve the intended result, such as sending a virtual golf ball to the location intended by the user. A sweet spot can be viewed as the region of the distribution curve that is above the threshold and within a required number of standard deviations of the mean. For example, with a large sweet spot, a successful result can be obtained if the value of a given variable is above the threshold, even if the result is not ideal. Furthermore, sweet spots may vary depending on the type of virtual equipment.
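The sweet-spot test described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not WGT's actual code; the function name, the Gaussian quality score, and the threshold value are all assumptions made for the sketch.

```python
import math

def swing_succeeds(deviation_sd, sweet_spot_sd, threshold=0.15):
    """Return True if a user interaction produces the intended result.

    deviation_sd:  how far the interaction fell from the ideal (0.0),
                   in standard deviations, per graph 202.
    sweet_spot_sd: half-width of the sweet spot, in standard deviations.
    threshold:     quality floor playing the role of threshold 202a.
    """
    # Quality falls off from 1.0 at the ideal, following a normal curve.
    quality = math.exp(-deviation_sd ** 2 / 2.0)
    # Success: quality clears the threshold AND the interaction lands
    # inside the sweet spot (within the allowed deviation of the mean).
    return quality >= threshold and abs(deviation_sd) <= sweet_spot_sd
```

Under this sketch, a swing 1.5 standard deviations off the ideal fails with a pro club (+/- 1 SD sweet spot) but succeeds with a beginner club (+/- 1.8 SD sweet spot), matching the contrast drawn above.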
For example, curve 202b may represent a professional forged iron club with a very small sweet spot (e.g., +/- 1 standard deviation), and curve 202c may represent a hollow-back, offset beginner iron club with a much larger sweet spot (e.g., +/- 1.8 standard deviations).
As users become more experienced with a piece of virtual equipment, the sweet spots for one or more of its variables are adapted so that only user interactions whose values are closer to the mean produce successful results. Similarly, if the user's proficiency decreases, the sweet spots may be adapted so that interactions whose values are farther from the mean still achieve a successful outcome.
Accuracy is the probability that a given piece of virtual equipment will behave as the user intended. An example of accuracy is the probability that a virtual golf ball, struck by a swing of a virtual golf club, will follow the intended trajectory and land at the target. As another example, accuracy may be the probability that a virtual target is hit when a virtual gun is fired. Precision is the probability that a user's interaction with a given piece of virtual equipment will yield the same result over time; for example, the probability that the same golf club swing will produce the same result. In one implementation, the accuracy and precision of a given piece of virtual equipment automatically increase as the user's skill increases; similarly, they can automatically decrease as the user's skill level decreases. This relationship is illustrated in example graphs 204 and 206 of FIG. 2. Likewise, as shown in graph 208, the user's ability to control the virtual equipment increases with skill level. Although the example graphs 204, 206, and 208 of FIG. 2 illustrate linear relationships, other relationships are possible and may be specified by the virtual equipment model, as described below.
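The linear relationships of graphs 204-208, and the inverse relationship between skill and sweet-spot size, might be modeled as below. The numeric ranges (0.2-0.95 for accuracy, 1.8 to 1.0 standard deviations for the sweet spot) are assumptions for illustration; only the 1.8/1.0 endpoints echo the example clubs above.

```python
def clamp01(x):
    """Clamp a skill value to the assumed [0, 1] scale."""
    return max(0.0, min(1.0, x))

def accuracy(skill, floor=0.2, ceiling=0.95):
    """Accuracy grows linearly with skill, as in graph 204."""
    return floor + (ceiling - floor) * clamp01(skill)

def sweet_spot_sd(skill, widest=1.8, narrowest=1.0):
    """Inverse relationship: higher skill yields a smaller sweet spot."""
    return widest - (widest - narrowest) * clamp01(skill)
```

A nonlinear variant would simply swap the interpolation inside these functions, which is the flexibility the VEM is said to provide.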
FIG. 3 is a diagram of a virtual equipment model (VEM) system 300 for a computer game application or other simulation. The functionality of system 300 may be distributed across fewer or more components than shown. The system 300 includes a VEM 306 that models a piece of virtual equipment. A piece of virtual equipment may include one or more objects in the virtual space, such as a set of virtual balls that a user juggles in a computer juggling game. In one implementation, there is a VEM 306 for each piece of virtual equipment the user interacts with in the virtual space. In a further implementation, the VEM 306 maintains a non-empty set of variables and a set of relationships between two or more variables to model the behavior of a piece of virtual equipment. In one implementation, the sweet spot for a piece of virtual equipment has an inverse relationship to the precision and accuracy of the virtual equipment.
In one implementation, VEM 306 includes, at minimum, variables representing precision, accuracy, one or more distribution curves (e.g., 202b, 202c), thresholds (e.g., 202a), and sweet spots, as described above. For example, if the virtual equipment is a golf club, the variables can include stroke power and club face trajectory, along with their distribution curves and the associated sweet spots and thresholds for club accuracy and club precision.
In general, the values of the VEM 306 variables are determined by the user's input, the user's skill with the virtual equipment, the attributes of the virtual equipment itself, the state of the virtual space as determined by game engine 310 (e.g., weather, the player's emotional and physical stress), configuration information, the values of one or more other variables, and combinations thereof. The input model 302 maps user input (e.g., button presses, voice commands, gestures, eye movements, body movements, brain waves, other physiological sensors, and combinations thereof) to values for one or more variables in the VEM 306 variable set. VEM 306 interprets the user input provided by input model 302 using its set of relationships. The VEM 306 has an associated representation 304 of the virtual equipment, presented to the user through graphical display means (e.g., liquid crystal or plasma display devices), sound generating means, haptic technology, odor generating means, or combinations thereof. For example, in a first-person shooter, a virtual gun may have a graphical representation of crosshairs indicating where the gun is currently aimed and sound feedback indicating the moment the virtual gun is fired. A joystick or other user input device can be used to aim the virtual gun, and a button can be pressed to fire it. VEM 306 communicates with game engine 310 to change the virtual space based on user interaction with VEM 306.
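A minimal data sketch of the VEM 306 / input model 302 pairing could look like the following. The structure is hypothetical: the variable names `power` and `trajectory` follow the golf example, and the two-click mapping is an assumption, not the patent's specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class VirtualEquipmentModel:
    """Sketch of a VEM 306: a variable set plus a sweet spot and threshold."""
    variables: Dict[str, float] = field(default_factory=dict)
    sweet_spot_sd: float = 1.8   # beginner-club width from the example above
    threshold: float = 0.15      # stand-in for threshold 202a

def input_model(vem: VirtualEquipmentModel, clicks: List[float]) -> Dict[str, float]:
    """Sketch of input model 302: map raw inputs (here, two timed mouse
    clicks) onto the VEM's variable set."""
    vem.variables["power"] = clicks[0]       # first click: stroke power
    vem.variables["trajectory"] = clicks[1]  # second click: club face trajectory
    return vem.variables
```

In the full system the game engine would then read these variable values, weigh them against the sweet spots, and update the virtual space.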
The variable set, the values of the variables, and the relationships in the VEM 306 may vary based on the state of the virtual space or the context and purpose for which the virtual equipment is used. For example, if the virtual equipment is a sword in a sword-fighting computer game, successful use of the sword requires proper attacking and defensive actions by the user. In addition to the sweet spot(s) associated with the virtual sword, each of the virtual sword's actions may itself have associated sweet spot(s), which may vary based on the type of attacking or defensive action the user attempts. In addition, the sweet spot of the sword may vary based on the type of sword, which affects the threshold level.
Proficiency monitor 308 monitors changes in user proficiency. A change in user proficiency can be detected from the user's success in using a given piece of virtual equipment to achieve one or more goals in the virtual space (e.g., improved scores), the ability to perform relatively difficult tasks with the virtual equipment, the use of the virtual equipment with improved precision, the use of the virtual equipment with improved accuracy, the amount of time the virtual equipment has been used, and combinations of these and other factors. In one implementation, user proficiency is quantified numerically. If the skill level increases or decreases beyond a certain threshold, the change is communicated to the VEM 306, which in turn can communicate the change to the input model 302 and the representation 304. Using a threshold value other than zero can prevent the VEM 306 from changing too quickly.
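The non-zero threshold behavior of the proficiency monitor can be illustrated with a small class. This is a sketch with an assumed numeric skill scale and threshold, not an implementation from the patent.

```python
class ProficiencyMonitor:
    """Sketch of proficiency monitor 308: report a change only when the
    numeric skill moves beyond a non-zero threshold, so the VEM is not
    adapted on every small fluctuation."""

    def __init__(self, threshold=0.1):
        self.threshold = threshold
        self.last_reported = 0.0   # skill level at the last reported change

    def update(self, skill):
        """Return True only when the change is significant; in the full
        system this is when VEM 306, input model 302, and the
        representation 304 would be notified."""
        if abs(skill - self.last_reported) >= self.threshold:
            self.last_reported = skill
            return True
        return False
```

Because `last_reported` only moves on a significant change, small oscillations around a baseline never trigger adaptation, which is exactly the "changing too quickly" problem the non-zero threshold addresses.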
Based on the change in proficiency, one or more of the VEM 306, the input model 302, and the representation 304 can be adapted to reflect the change. Adaptation of the VEM 306 includes changing the values of one or more variables in the variable set, changing one or more relationships in the relationship set, adding one or more variables to or removing them from the variable set, adding one or more relationships to or removing them from the relationship set, and combinations thereof. If the user's proficiency improves, for example, the VEM 306 may add variables that control aspects of the virtual equipment that were not available at lower proficiency, and may change the distribution curves, thresholds, and sweet spots.
Adapting the input model 302 may include changing the way the user interacts with the representation 304 by adding or removing required or optional user input, changing the order of user input, changing the meaning of user input, or changing the mapping of user input to one or more variables in the VEM 306 variable set. For example, if the virtual equipment is a golf club, the user input at one proficiency level may consist of two mouse button clicks: the first click sets the power of the stroke, and a second click within a preset time limit of the first determines the trajectory of the club face when it strikes the virtual golf ball. User input at a higher proficiency level may add a third click to determine the loft of the virtual golf ball. Adapting the representation 304 may include changing the virtual equipment's appearance, user interface, haptics, smell, or a combination thereof. For example, if the input model 302 or the VEM 306 is adapted, the representation can be modified to indicate this to the user; the appearance of the virtual golf club may change to show that the user is playing with a higher-level club.
The game engine 310 maintains the state of the virtual space based on user input and the interaction of objects in the virtual space. The game engine 310 may include a renderer for rendering graphic images of the virtual space to be provided to a display device. The game engine may also include artificial intelligence capabilities to determine one or more future states of the virtual space. Objects in the virtual space, such as virtual equipment, are associated with assets 312 (e.g., content, models, sounds, physical properties, artificial intelligence). Assets are used by game engine 310 to display objects and render the computer game. Game engine 310 communicates with proficiency monitor 308 to convey user proficiency information, such as a detected change in user proficiency. VEM 306 communicates with game engine 310 to change the virtual space based on the user's interaction with VEM 306.
FIG. 4 illustrates a virtual equipment model adaptation process. The user's skill level for a piece of virtual equipment is determined, for example, by proficiency monitor 308 (step 402). It is then determined whether the skill has increased or decreased beyond the threshold (step 404). If the user's skill level has not increased or decreased beyond the threshold, the user skill level is determined again at a later time (step 402). Otherwise, the VEM 306 associated with the virtual equipment is adapted based on the user skill level (step 406), for example by changing the value of one or more sweet spots or other variables associated with the virtual equipment. The input model 302 and the representation 304 can optionally be adapted based on the user skill level (step 408), for example by rendering the head of the golf club differently to emphasize the club's changed characteristics.
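One pass of the FIG. 4 process might be sketched as a pure function. This is hypothetical: the sweet-spot formula reuses the example +/- 1.8 to +/- 1.0 SD range from the description above and is not taken from the patent.

```python
def adaptation_pass(skill, last_skill, sweet_spot_sd, threshold=0.1):
    """One pass of the adaptation process:
    - the newly determined skill level is supplied by the caller;
    - if skill has not moved beyond the threshold, nothing is adapted
      and the process will simply run again later;
    - otherwise the VEM's sweet spot is adapted (smaller spot for higher
      skill) and the new baseline skill is recorded.
    Returns (new_baseline_skill, new_sweet_spot_sd)."""
    if abs(skill - last_skill) < threshold:
        return last_skill, sweet_spot_sd       # no adaptation this pass
    s = max(0.0, min(1.0, skill))
    return skill, 1.8 - 0.8 * s                # e.g. skill 0.0 -> 1.8 SD, 1.0 -> 1.0 SD
```

The optional step of adapting the input model and the representation would hang off the same branch that adapts the sweet spot.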
FIG. 5 is a block diagram of an example system architecture 500 for automatically adapting a virtual equipment model. The architecture 500 may include one or more processors 502 (e.g., IBM PowerPC®, Intel Pentium® 4), one or more display devices 504 (e.g., CRT, LCD), one or more graphics processing units 506 (e.g., NVIDIA® Quadro FX 4500, GeForce® 7800 GT), one or more network interfaces 508 (e.g., Ethernet, FireWire, USB), one or more input devices 510 (e.g., keyboard, mouse, game controller, camera, microphone), and one or more computer-readable media 512 (e.g., SDRAM, optical disk, hard disk, flash memory, L1 or L2 cache). These components exchange communications and data over one or more buses 514 (e.g., EISA, PCI, PCI Express).
The term "computer-readable medium" refers to any medium that participates in providing instructions to the processor 502 for execution, including without limitation non-volatile media (e.g., optical or magnetic disks), volatile media (e.g., memory), and transmission media. Transmission media include, without limitation, coaxial cables, copper wire, and optical fiber. Transmission media can also take the form of acoustic, light, or radio frequency waves.
Computer-readable medium 512 may include an operating system 516 (e.g., Mac OS®, Windows®, Linux), a network communication module 518, computer game assets 520, and a computer game application 522. The computer game application 522 further includes a game engine 524, a proficiency monitor 526, one or more VEMs 528, one or more input models 530, and one or more representations 532. In some implementations, the computer game application 522 may be integrated with another application 534 or configured as a plug-in to another application 534.
Operating system 516 may be multi-user, multiprocessing, multitasking, multithreading, real-time, and the like. The operating system 516 performs basic tasks, including but not limited to: recognizing input from the input devices 510; sending output to the display devices 504; keeping track of files and directories on computer-readable medium 512 (e.g., memory or storage devices); controlling peripheral devices (e.g., disk drives, printers, GPU 506); and managing traffic on the one or more buses 514. The network communication module 518 includes various components for establishing and maintaining network connections (e.g., software for implementing communication protocols such as TCP/IP, HTTP, Ethernet). The computer game application 522, together with its components, implements the various tasks and functions described with respect to FIGS. 2-4.
The user system architecture 500 may be implemented in any electronic or computing device capable of hosting the computer game application 522 or a portion of it, including but not limited to portable or desktop computers, workstations, mainframe computers, personal digital assistants, portable gaming devices, mobile phones, network servers, and the like. Any of these components may be physically separate from the others.
Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, a data processing apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from read-only memory or random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile phone, a personal digital assistant (PDA), a portable audio player, a Global Positioning System (GPS) receiver, and the like. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media, and memory devices, including by way of example semiconductor memory devices such as erasable PROM (EPROM), electrically erasable PROM (EEPROM), and flash memory devices; magnetic disks such as internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, embodiments of the invention can be implemented on a computer having a display device, e.g., a cathode ray tube (CRT) or liquid crystal display (LCD) monitor, for displaying information to the user, and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual, auditory, or tactile feedback, and input from the user can be received in any form, including acoustic, speech, brain wave or other physiological input, eye movement, gestures, body movements, or tactile input.
Embodiments of the invention can be implemented in a computing system that includes a back-end component, e.g., a data server, or a middleware component, e.g., an application server, or a front-end component, e.g., a client computer with a graphical user interface or a web browser through which a user can interact with an embodiment of the invention, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), e.g., the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of the claims, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.
Similarly, although the operations are shown in a particular order in the drawings, it should not be understood that such operations must be performed in the specific order or sequential order shown to achieve a desired result, or that all operations must be performed. In some circumstances, multitasking and parallel processes may be advantageous. Furthermore, configuring the various system components separately in the above embodiments should not be understood as such separation is required in all embodiments. In addition, the described program components and systems may generally be integrated into a single software product or packaged into a plurality of software products.
Accordingly, certain embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in different ways while at the same time obtaining the desired results.


Last edited by Paul on Wed 30 Mar 2022, 11:45 pm; edited 1 time in total


Paul



Please enjoy

_________________

May the SUN always be with you

home of

https://www.valleyofthesuncc.com/ an information and entertainment only website

Providing offers to computer game players

Post by Paul Sun 27 Mar 2022, 10:35 pm

Abstract



A method of rewarding a game player of a game can target particular types of offers to a player. A usage profile is determined for the player, which can be based on interaction between the player and items or goals in a game. Player information is identified, such as player statistics or categorization of the player. A category is selected for the player, based on the usage profile and the player information for the player. An offer is then selected from a group of offers, wherein the offer is for a real world object, service or event.



Images (7)



Classifications



 


A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories  


US8342951B2


United States



Inventor: Yuchiang Cheng; Current Assignee: World Golf Tour Inc


Worldwide applications
2008   US   2011   US  


Application US13/198,608 events
2008-03-27 : Priority to US12/057,276
2011-08-04 : Application filed by World Golf Tour Inc
2011-11-24 : Publication of US20110288666A1
2013-01-01 : Application granted
2013-01-01 : Publication of US8342951B2
Status : Active
2028-03-27 : Anticipated expiration






Description



CROSS REFERENCE TO RELATED APPLICATIONS
This application is a Continuation and claims priority to U.S. patent application Ser. No. 12/057,276 filed on Mar. 27, 2008, the entire content of which is incorporated herein by reference.
BACKGROUND
The present disclosure relates to using game play and offers made to the game players.
Computer games and other types of simulations (hereinafter referred to as “games”) recreate fantasy worlds or environments and virtual versions of real world environments (e.g., baseball diamonds, race tracks, and golf courses) through three dimensional (3D) computer generated graphics. Players interact with these worlds individually or in groups in order to achieve goals such as, for example, the accrual of points, the accrual of virtual money or property, killing “bad” guys, virtual trophies or even a real world prize that is provided to a player who achieves the most points or achieves a predetermined goal.
SUMMARY
In one aspect, a method of rewarding a game player is described. The method includes determining a usage profile based on interaction between the player and items or goals in a game, identifying player information based on player statistics, categorizing the player, wherein the categorizing uses the usage profile and the player information to select a category for the player, selecting an offer from a group of offers, wherein the offer is for a real world object, service or event and the selecting step includes selecting from a group of offers available for players in the category and providing the offer to the player. Other embodiments of this aspect include corresponding systems, apparatus, and computer program products.
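The claimed sequence — a usage profile plus player information feeding a categorization step, which in turn selects an offer — can be sketched as follows. This is an illustrative outline only; the function names, field names, and the toy categorization rule are invented, not taken from the patent's implementation.

```python
def categorize(usage_profile: dict, player_info: dict) -> str:
    # Toy rule (an assumption): brand-loyal players with higher income
    # fall into a "premium" category; everyone else is "standard".
    if player_info.get("income", 0) >= 100_000 and usage_profile.get("favorite_brand"):
        return "premium"
    return "standard"

def select_offer(usage_profile: dict, player_info: dict, offers_by_category: dict):
    # Select an offer from the group of offers available for the
    # player's category, per the claimed method steps.
    category = categorize(usage_profile, player_info)
    candidates = offers_by_category.get(category, [])
    return candidates[0] if candidates else None
```

In practice the category-to-offer mapping would be populated from a third-party offer database; here it is just a dictionary passed in by the caller.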
In another aspect, a method of determining game player information is described. The method includes receiving demographic information about the player, receiving game play information reflecting interactions between the player and an item or a goal in a game, compiling the demographic information and game play information to create a profile and providing the profile to a third party for the third party to determine an offer for a real world object, service or event to provide to players matching the profile, wherein the different offers are provided to different players according to the demographic information in the profile. Other embodiments of this aspect include corresponding systems, apparatus, and computer program products.
Implementations can include one or more of the following features. Determining a usage profile can include determining that the player has selected a type of equipment for use in the game, a brand of equipment for use in the game, or a goal to achieve in the game, has selected to use an item within the game with a particular frequency or for a particular length of time or has a particular number of items in a game bag. Identifying player information can include determining psychographic data of the player, demographic data of the player or behavioral variables about the player. The group of offers can include discounts on real world objects or opportunities. Providing the offer can include providing the offer to the player through one of electronic message or a postal service. The real world offer can be received from a third party and provided to the player.
These and other implementations can provide one or more of the following advantages. Providing a player of a game with real world offers on items, services or events, such as dinners, parties, concerts, vacations or conventions, can give the player greater enticement to play the game and achieve particular goals. This can keep a player playing the game longer. If there is any advertising or brand placement within the game, the advertising and brands can receive more of the player's attention time if the player continues to play the game. In turn, the offers for the items or events provide brands and distributors the opportunity to market to individuals who are more likely to be interested in their products. This can provide an efficient way of reaching out to interested consumers without spending time and capital on consumers who have little interest in a product or little ability to purchase the product. That is, marketers can target or customize their brands, services and products to an appropriate audience rather than making a generic offer to all players of a game. The offers for the real world items can range from very simple to very complex. For example, the offer can be simply a discount on a real world item that the player also purchased in the game. A complex offer could be based on the individual's income, job, marital status or other demographic or psychographic data in combination with selections made by the player within the game. The techniques described herein can allow for better targeting of an offer to a player. That is, the offer is more likely to be relevant to the player, which can increase the likelihood the player will take advantage of the offer and follow through on a real world purchase. This correlation can be achieved by a greater number of impressions that the player receives of a particular advertisement, type of equipment, or brand.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example of player information.
FIG. 2 illustrates an example of a usage profile.
FIG. 3 is a schematic diagram of an example game system.
FIG. 4 is a schematic diagram of an example player system.
FIG. 5 illustrates an example of a game player reward determination process.
FIG. 6 illustrates an example of a process for presenting an offer to a game player.
Like reference numbers and designations in the various drawings indicate like elements.
DETAILED DESCRIPTION
A computer game allows one or more players to interact with representations of virtual environments, equipment, objects, characters, and associated state information. For instance, a virtual universe can include a virtual version of real world environments and objects, such as a virtual golf course, golf clubs and golf balls, or fabricated environments and objects. Players can interact with one or more pieces of virtual equipment in a virtual universe, such as a virtual weapon or a virtual golf club. The virtual universe can also include avatars and other virtual representations of a player including, but not limited to, a player's movements and gestures. In some implementations, a player can make purchases of real world items or services, or virtual items or services. Examples of virtual items that can be purchased include, but are not limited to, equipment, clothing (e.g., to appear on an avatar), accessories, property, and powers or capabilities for a character or a player's avatar. A player's purchases, whether of virtual or real world goods/services, and behavior during game play can be tracked. Offers for products and services can be identified and presented to the player that are relevant to, for example, a player's perceived personality and affinity for different items or environments. In some implementations, the offers can be based in part on information gathered about the player.
FIG. 1 illustrates an example of player information 100. A game system, such as a virtual golf game system, can access information about players. Some of the player information 100 can, for example, be gathered (e.g., entered by a player) when the player creates an account with the game system. Some of the player information 100 can be gathered as the player uses the game system. In some implementations, the player can update the player information 100, for example by using an edit account interface that allows updates to a player profile. In further implementations, some or all of the player information 100 can be collected automatically by searching various sources of information based on known information about the player such as the Internet Protocol (IP) address of the player's computer or game console, the geographical location of the player (e.g., derived from an IP address), online social or entertainment networks the player belongs to (e.g., Facebook, Friendster, and MySpace), and other suitable sources of information.
The player information 100 can include a player identifier (ID) 102, a player location or address 104, days and times of game play 106, and personal information 108, for example. Other types of player information are also possible. The player ID 102 identifies a player in the game system. Information related to the player can be stored in the game system using the player ID 102 as a key, for example. By way of illustration, the player ID 102 can include a player-entered value (e.g., an identifier chosen and entered by the player, such as a text value), a system-generated value (e.g., an automatically generated identifier, which can be unique and which can be standardized in the game system) or combinations thereof. Other player IDs are also possible. In some implementations, the player ID 102 can be associated with a password.
The player address 104 can include one or more of street address, city, zip code, country, phone number, geocode, IP address, and email address, for example. Other address information is possible. The player address 104 can be entered by the player, such as when the player creates an account. The player address 104 can indicate probabilities of characteristics associated with the player. For example, a player's address can be looked up in a demographic repository that includes various demographic statistics associated with geographic locations. For example, the repository can include distributions of annual income, education level and profession grouped by zip code. Using the demographic repository, the game system can determine, for example, a likely annual income, education level, or profession for the player, based, for example, on the player's zip code or phone number. The demographic repository can be maintained by the game system, or the game system can interface with a third party service provider system to access the demographic repository.
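The demographic-repository lookup described for the player address 104 amounts to keying aggregate statistics by a location field such as zip code. A minimal sketch, with entirely invented repository contents:

```python
# Hypothetical demographic repository keyed by zip code; the game
# system (or a third-party service) would maintain the real data.
DEMOGRAPHICS_BY_ZIP = {
    "33139": {"median_income": 95_000, "top_profession": "finance"},
    "85004": {"median_income": 62_000, "top_profession": "services"},
}

def likely_demographics(zip_code: str) -> dict:
    # Return likely characteristics for a player at this zip code,
    # or an empty estimate when the zip code is unknown.
    return DEMOGRAPHICS_BY_ZIP.get(zip_code, {})
```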
The game system can store the current day and time of each instance that the player plays a game in the day and time of play information 106. For example, a game system can store a start and a stop time each time the player plays a game. In general, the day and time of play can be indicative of a player's profession. Players of particular professions may only be able to play during certain times of the day. For example, players working in certain professions (e.g., “blue-collar” professions) may not be able to play at all during regular first-shift working hours, whereas players working in other professions or players who are self-employed can have the flexibility to play during lunch hours or at various other times throughout the day. A player's flexibility of day and time of play can indicate a player's ability to accept (and therefore have potential interest in) certain offers, such as vacation offers, which require a profession with flexible work arrangements.
The personal information 108 can include marital status, employer, profession, number and age of children, salary, political party, race, gender, sexual orientation, television viewing habits, web browsing habits, and other information of a personal nature. Other player information can include the identification of players or other individuals that a given player considers to be friends, such as through social networking, where the player shops, hobbies or interests, education level, electronic devices the player owns, vehicles the player owns and whether the player owns or rents a home. A player's friends can also be teammates or other players that participate in multi-player games with the player.
The player can enter some or all of the personal information 108. Additionally or alternatively, some or all of the personal information 108 can be indicators, which are determined based on the player's address. The system can also use information from other data in the player information 100 to gather data about the player, such as from publicly available databases. The gathered data can be used to populate the personal information 108. For example, a player's address can determine an income likelihood, as discussed above. In some implementations, the personal information 108 can be used to tailor offers made to the player. For example, offers can be presented to players having a salary equal to or greater than a minimum salary. Certain offers can also be presented to married players, players with children, players in a certain geographic location, etc.
In some implementations, a team or a group of players that play the game together can be provided an offer. The offer can be based on characteristics of the group, such as group demographics or group game goals. Information about each of the individuals on the team can be compiled to result in team information, which is then used to determine an offer to be made to the team. For example, the team information can be used in place of the player information 100.
FIG. 2 illustrates a usage profile 200. Information in the usage profile 200 can be determined based on interaction between the player and items or goals in a game. In general, information in the usage profile 200 can indicate psychographic traits of a player, such as attitudes, personality and lifestyle. More particularly, information in the usage profile 200 can indicate a player's affinity for certain brands and/or types of equipment. Information in the usage profile 200 can be used to tailor offers made to the player.
The usage profile 200 can include the player ID 102, clothing choices 204, brands of gear 206, types of gear 208, number of similar items 210, length of use of items 212, frequency of use of items 214, and goals worked towards 216. One or more of the items 204-212 is optional. The player ID 102 is an identifier of a player, and can be the same identifier used to store the player information 100 (FIG. 1).
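The items enumerated above suggest a simple record shape for the usage profile 200. One possible layout, sketched as a dataclass — the field names mirror the enumerated items but the concrete types are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class UsageProfile:
    player_id: str                                             # player ID 102
    clothing_choices: list = field(default_factory=list)       # clothing choices 204
    gear_brands: dict = field(default_factory=dict)            # brands of gear 206
    gear_types: dict = field(default_factory=dict)             # types of gear 208
    similar_item_counts: dict = field(default_factory=dict)    # number of similar items 210
    use_length_seconds: dict = field(default_factory=dict)     # length of use of items 212
    use_frequency: dict = field(default_factory=dict)          # frequency of use of items 214
    goals_worked_towards: list = field(default_factory=list)   # goals worked towards 216
```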
A player can choose clothing and accessories that can appear on an avatar representing the player, and the player's selections can be stored in the clothing 204 choices. For example, in a virtual golf game, a player might choose a golf shirt and pants, shoes, gloves, and a hat or visor. As another example, in a virtual fishing game, a player might choose a shirt, hat, sunglasses, shoes or boots, and pants or shorts.
A player can select clothing from various styles and brands. Clothing used in the game can correspond to real-world brands. Contracts can be made with real-world companies to allow companies, e.g., at a defined cost, to include their clothing items in a game. A player can be able to select clothing items for free or a player can virtually buy clothing items using virtual money earned during game play. If a player virtually buys clothing, purchasing patterns can be determined, such as a player's tendency to spend more or less on clothing, as compared to other purchases the player can make (e.g., gear). As another example, a player's tendency to purchase more or less expensive clothing can be determined, based on the clothing items a player purchases. Identified clothing purchasing patterns can be used to tailor offers made to the player. For example, if a player has an affinity for a certain brand of clothing while playing a game, an offer for a real-world clothing item from the same manufacturer can be presented to the player.
A player can select to use gear (i.e., equipment) during game play. For example, in a virtual golf game, a player can select one or more styles and brands of golf clubs to use, a style and brand of golf bag, and a golf cart. As another example, in a virtual fishing game, a player can select a style and brand of one or more rods and reels to use, a style and brand of one or more fishing lures, a style and brand of a tackle box, and a style and brand of fishing boat and motor. A player's gear brand choices can be stored in the brands of gear information 206. In addition, the player's proficiency at using the gear can be determined and stored for later offer determination.
A player can be able to select gear to use for free or a player can have to (for some or all items) virtually buy gear before using the gear. Contracts can be made with real-world companies to allow companies to include their gear items in a game. If a player virtually buys gear, purchasing patterns can be determined, such as a player's tendency to spend more or less on gear, as compared to other purchases the player can make (e.g., clothing). As another example, a player's tendency to purchase more or less expensive brands of gear can be determined based on the gear items a player purchases. Identified gear purchasing patterns can be used to tailor offers made to the player. For example, if a player has an affinity for a certain brand of gear while playing a game, an offer for a real-world item from the same manufacturer can be presented to the player.
In addition to choosing among different brands of gear, a player can choose different types of gear. For example, in a virtual golf game, a player can choose different types of golf clubs, such as fairway wood and driver clubs. As another example, in a virtual fishing game, a player can choose different types of reels, such as a fly reel or a spinning reel. A player's gear type choices can be stored in the types of gear 208.
A player's choice of type of gear can indicate a player's personality. For example, use of some gear (e.g., use of a fairway wood club in a virtual golf game) can indicate a more conservative personality than the use of other, more challenging gear (e.g., a driver). That is, some types of gear have more or less risk/reward tradeoffs than other types of gear. As another example, some types of gear (e.g., some types of golf clubs or fishing rods) can be more expensive than other types of gear.
A player's affinity for brands, types of clothing or gear can be determined by storing the number of similar items a player uses or purchases in the number of similar items 210. All purchases, including clothing and gear, can be grouped, such as by brand. For example, in a virtual golf game, a golf club manufacturer can also produce a golf clothing line, and the player's overall affinity for a brand can be determined by looking at both gear and clothing purchases. A player's affinity for a particular type of gear can be determined by examining the number of items of that type that the player uses or purchases. For example, in a virtual golf game, the purchase of a large number of putters can indicate a player's affinity for that type of golf club. As another example, in a virtual fishing game, the purchase of a large number of fly reels can indicate a player's affinity for fly reels. A player's affinity for a particular type of gear can result in offers for that particular type of gear (e.g., putters, fly reels) being presented to the player.
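The affinity inference described above — grouping purchases of gear and clothing by brand and looking at the counts — can be sketched as follows. The purchase record layout and the two-item threshold are assumptions for illustration:

```python
from collections import Counter

def brand_affinity(purchases):
    # Group all purchases (gear and clothing alike) by brand and
    # report the most-purchased brand as the player's affinity.
    counts = Counter(p["brand"] for p in purchases)
    if not counts:
        return None
    brand, n = counts.most_common(1)[0]
    # Require at least two items of one brand before calling it an affinity.
    return brand if n >= 2 else None
```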
In addition to the purchase choices a player makes, a player's use of items can indicate player traits, such as their affinity for certain brands or types of gear. For example, in a virtual golf game, although a player can have purchased several types and brands of golf clubs, the player can choose to use a particular type or brand of club more often than other types or brands. As another example, in a virtual fishing game, although a player can have purchased many types and brands of lures and can have many lures in a virtual tackle box, the player can choose to use a particular type or brand of lure more often than other types or brands of lures. The length of time a player uses each item can be stored in the length of use of items 212.
A player's choice of a particular type of item in certain situations can indicate aspects of the player's personality. For example, in a virtual golf game, a player can show a pattern of choosing to use iron clubs that would be considered aggressive choices over more conservative clubs. Offers, such as vacation packages, can be tailored to players based on how aggressive the player appears to be (e.g., a “thrill-seeking” vacation package can be offered to players who appear to have an aggressive personality). The avatar selected by the player can have a particular gender, outward appearance and/or sexual orientation. The avatar can also have other social properties, such as belonging to social groups or virtual clubs. Thus, the avatar properties can also be used to determine player preferences and interests.
A player can send messages to other players during game play, such as instant messages or electronic mail messages. The messages can be reviewed for keywords. The keywords can be used to trigger events. The events or keywords can be logged as part of the player information or even be stored in the usage profile.
A game can include goals that a player works towards. Information about a player's goals can be stored in the goals worked towards information 214. The particular goals that a player chooses to work towards can indicate aspects of their personality. For example, in a virtual golf game, a player can choose to play various virtual golf courses and a player's pattern of virtual golf course selection can be observed. For example, if a player plays the same course repeatedly, this can indicate a different personality type and interest than a player who never plays the same course twice in a row. As another example, in a virtual fishing game, a player can choose to virtually fish different virtual locations which can correspond to real geographic locations. A player's selection of virtual golf courses or virtual fishing locations can indicate an affinity for particular real-world locations. For example, a player who shows an affinity for virtual golf courses representing real-world courses in Florida can be presented with real-world vacation or golf packages located in Florida.
A player's activity related to goal achievement can indicate personality traits of the player. For example, in a virtual golf game, the player can work towards goals associated with a virtual course, such as beating an established par score for the course. Some players can choose to repeatedly play the same course until they beat the par score, while other players can show a pattern of trying a variety of courses, regardless of whether they beat par scores. A player's indicated preference of variety can be used, for example, to tailor offers presented to the player, such as vacation offers. As another example, in a virtual golf game, a player's aggressiveness style can be determined based on the difficulty level of virtual golf courses selected.
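The course-selection pattern described above (repeat play versus constant variety) can be reduced to a toy metric: the ratio of distinct courses to rounds played. The metric itself is an invented illustration, not the patent's method:

```python
def variety_score(courses_played):
    # 1.0 means every round was on a different course (variety-seeking);
    # values near 1/len(courses_played) mean repeat play of one course.
    if not courses_played:
        return 0.0
    return len(set(courses_played)) / len(courses_played)
```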
FIG. 3 illustrates an example of a system architecture 300. There may be fewer or more components than those illustrated in the system architecture 300. The system architecture 300 includes a game system 302 connected to a player system 304 across a network 306. The player interacts with a game (e.g., virtual golf game, virtual fishing game) on the player system 304. The game system 302 can be, for example, a single server or a group of multiple servers. The player system 304 can be implemented by a personal desktop computer, laptop computer, a game console, smart phone, cellular phone, personal digital assistant, or portable gaming device, to name a few examples. The network 306 can be a public or private, local or wide area network, such as the Internet. In some implementations, the system 302 and the player system 304 reside on the same computing device.
The game system 302 includes a player information database 308, a player information compiler 310, a player categorizer 312, a third party offer database 314, and an offer engine 316. The player information database 308 includes information about players using the player system 304. For example, the player information database 308 can include, for each player, the player information 100 (FIG. 1) and the usage profile 200 (FIG. 2).
The player information compiler 310 compiles information for a player, such as compiling the player information 100 and the usage profile 200. The player categorizer 312 can categorize players or their friends based on information in the player information database 308 or other information. For example, players can be categorized based on characteristics such as income, education level, profession, favorite brands used, favorite equipment used, and other game play characteristics.
The third party offer database 314 includes information about offers that can be presented to a player or the player's friends. The offer engine 316 can match offers in the third party offer database 314 to player categories identified by the player categorizer 312. For example, a luxury vacation offer can be matched to a category of players having an income at or above a threshold. The offer engine 316 can present an offer to one or more players included in a player category associated with the offer. For example, an offer can be presented through electronic or postal mail, and/or as an electronic message or advertisement presented on the player system 304 (e.g., during game play or while the player is accessing their game account). That is, an offer can be presented to the player in an electronic mail message that the player receives next time the player opens his or her e-mail account. Alternatively, the player can have a game account page, such as a profile page, where offers can be listed and kept until expiration. The game account page can be a page where the player can input or change his or her personal information, can provide information about the user's previous scores obtained when playing the game or can keep a list of friends with whom the player plays the game. Advertisements that are presented to the player, either in the game during game play or on the player's game account page can be related to offers that are presented to the player. Alternatively, the advertisements can be to encourage the player to either purchase items in the game or achieve particular goals within the game. Those achieved goals or purchased items that are based on the advertisements can then be fed into the determination of making a real world offer to the player. 
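The matching step performed by the offer engine 316 — e.g., gating a luxury vacation offer on an income threshold — can be sketched like this. The offer records and threshold values are illustrative assumptions:

```python
# Hypothetical third-party offer records; a real offer database
# would carry many more attributes (duration, priority, delivery).
OFFERS = [
    {"name": "luxury vacation", "min_income": 150_000},
    {"name": "discount range balls", "min_income": 0},
]

def offers_for(player: dict) -> list:
    # Present every offer whose income threshold the player meets.
    income = player.get("income", 0)
    return [o["name"] for o in OFFERS if income >= o["min_income"]]
```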
Offers can be for clothing, vacation packages, equipment, services, lessons, coupons, free or discounted golf play times, free or discounted buckets of golf driving range balls, court time, internet offers, brick and mortar store offers, tours, or entrance tickets, to name a few examples.
In some embodiments, offers are created in a database by an administrator who also creates rule sets and sets parameters to trigger the offer. The offers are simply reward parameters, e.g., a 10% off coupon at a store, or a free virtual item, a gift certificate, or other such offer. A rule of the rule set determines when the offer is made and to whom. The rules can be based on any data or action that is collected in player information database 308, player information compiler 310, or player categorizer 312. An exemplary rule can require a player who is over 30 years old, has played a course in Florida and spent over $50 in the game in the last 30 days. This rule can trigger offer X, such as a reduced round of golf at a specific golf course. The administrator for the rule set can determine how the offer is delivered, such as by electronic mail, immediately within the game, as an instant message, by postal service, etc. The rule can also include the duration the offer is available and the priority of the offer in case there is a situation where multiple offers are available to the player. Offers can also be designated as unique (offered only once per person) or non-combinable (cannot be combined with other offers).
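The exemplary rule in the text (over 30 years old, has played a Florida course, spent over $50 in the last 30 days) can be sketched as a predicate. The player-record layout is an assumption; a real rule engine would evaluate administrator-authored rules against the player information database:

```python
from datetime import date, timedelta

def rule_matches(player: dict, today: date) -> bool:
    # Sum purchases made within the trailing 30-day window;
    # purchases are assumed to be stored as (date, amount) pairs.
    recent = today - timedelta(days=30)
    recent_spend = sum(amt for d, amt in player.get("purchases", []) if d >= recent)
    return (
        player.get("age", 0) > 30
        and "FL" in player.get("courses_played_states", set())
        and recent_spend > 50
    )
```

When the predicate fires, the engine would deliver offer X through the administrator-chosen channel (electronic mail, in-game, instant message, or postal service).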
As noted, an offer can be made to the player within the game or during game play. For example, if the player of a golf game plays a hole particularly poorly, that is, lands in a water hazard or scores more than one triple bogey in a row, the player may immediately receive an offer to upgrade their virtual golf club with a virtual golf club they can purchase that has better performance and is in a suitable price range for the player based on demographic information. As another example, if a team of players scores particularly well, such as when a team in a scramble game of golf scores lower than another team for three holes in a row, the team members may be offered a buy one, get one free meal or drink offer at a local sports bar.
FIG. 4 illustrates an example of the player system 304. The player system 304 includes one or more processors 402 (e.g., IBM PowerPC®, Intel Pentium® 4, etc.), one or more display devices 404 (e.g., CRT, LCD), one or more graphics processing units 406 (e.g., NVIDIA® Quadro FX 4500, GeForce® 7800 GT, etc.), one or more network interfaces 408 (e.g., Ethernet, FireWire, USB, etc.), one or more input devices 410 (e.g., keyboard, mouse, game controller, camera, microphone, etc.), and one or more computer-readable mediums 412 (e.g. SDRAM, optical disks, hard disks, flash memory, L1 or L2 cache, etc.). These components can exchange communications and data via one or more buses 414 (e.g., EISA, PCI, PCI Express, etc.).
The term “computer-readable medium” refers to any medium that participates in providing instructions to a processor 402 for execution, including without limitation, non-volatile media (e.g., optical or magnetic disks), volatile media (e.g., memory) and transmission media. Transmission media includes, without limitation, coaxial cables, copper wire and fiber optics. Transmission media can also take the form of acoustic, light or radio frequency waves. The computer-readable medium 412 further includes an operating system 416 (e.g., Mac OS®, Windows®, Linux, etc.), a network communication module 418, computer game assets 420, and a computer game application 422.
The network communication module 418 can provide processing that allows the player system 304 to communicate with other systems (e.g., the game system 302) across the network 306, using the network interface(s) 408. The computer game application 422 further includes a game engine 424, a player information engine 426, a usage monitor 428 and one or more input models 430. The player information engine 426 can gather and store, for example, the player information 100 during game play or at other times. The usage monitor 428 can monitor a player's game play, and can, for example, store usage information in the usage profile 200. The input model 430 interprets user input to the game, such as user interaction with a game controller, user gestures and body movements, and other input, and provides such input to the game engine 424. In some implementations, the computer game application 422 can be integrated with other applications 434 or be configured as a plug-in to other applications 434.
The operating system 416 can be multi-player, multiprocessing, multitasking, multithreading, real-time and the like. The operating system 416 performs basic tasks, including but not limited to: recognizing input from input devices 410; sending output to a display device 404; keeping track of files and directories on computer-readable media 412 (e.g., memory or a storage device); controlling peripheral devices (e.g., disk drives, printers, GPUs 406, etc.); and managing traffic on the one or more buses 414. The network communications module 418 includes various components for establishing and maintaining network connections (e.g., software for implementing communication protocols, such as TCP/IP, HTTP, Ethernet, etc.). The application 422, together with its components, implements the various tasks and functions, as described with respect to FIGS. 1-3 and FIGS. 5-6.
The player system 304 can be implemented in any electronic or computing device capable of hosting the application 422, or part of the application 422, including but not limited to: portable or desktop computers, workstations, main frame computers, personal digital assistants, portable game devices, mobile telephones, network servers, etc. All of these components can be physically remote from each other.
While FIGS. 3 and 4 depict an example configuration of systems 300 and 304, other configurations are possible, including configurations where some or all of the system 300 components are part of the player system 304, and vice versa. By way of illustration, the computer game application 422 can largely be part of the system 302 and be replaced with a lightweight version of itself, such as an Adobe Flash presentation that runs in a web browser on the player system 304 and communicates with the system 302 through the network 306.
FIG. 5 illustrates one embodiment of a game player reward determination process 500. Player information and a game play usage profile are compiled for a player (step 502). For example, the player information compiler can, for a player, compile player information (e.g., personal information, demographics) and game usage information (e.g., types and brands of gear and clothing used, length and frequency of use of items, goals worked towards).
The player is categorized (step 504). For example, the player can be categorized by the player categorizer 312 (FIG. 3) into one or more categories. For example, categories can include players with flexible work schedules, players with luxury goods interest, players with an affinity for a particular brand, players with an interest in particular geographic locations, players with an interest in extreme sports or adventure, etc.
An offer appropriate for the player is determined, based on the category (step 506). For example, one or more offers can be determined, as appropriate for the player. Offers can be provided by third parties, such as clothing and gear manufacturers, retailers, travel agencies, or other service providers, to name a few examples. In some embodiments, rules are compiled in the database periodically, such as every few hours, and players matching the criteria are made an offer at the time the rules are compiled. Some rules, such as ones that are triggered by a single in-game event, can be run in real time. An exemplary real-time rule can state that if a player purchases a specific branded virtual item, the player is provided an e-mail offer for a 10% discount on the same item in a real-world, or brick-and-mortar, store.
The player is provided with the offer (step 508). The offer can be provided, for example, through electronic mail, short message service (SMS), or regular postal mail, through another suitable method of offer delivery, or it can be presented to the player while the player is using a game system. In-game communication systems can include instant messaging, the administrator posting a message on the player's home page, or group chat where offers can be communicated.
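Steps 502-508 above can be strung together as a small pipeline. This is a hedged sketch using dict-based stand-ins for the compiler, categorizer, and offer store; none of these names come from the patent.

```python
def reward_process(player_info, usage_profile, category_rules,
                   offers_by_category, deliver):
    """Minimal sketch of process 500.
    category_rules: {category_name: rule(profile) -> bool}
    offers_by_category: {category_name: [offer, ...]}
    deliver: callable(profile, offer) handling the delivery channel."""
    # Step 502: compile player information and usage into one profile.
    profile = {**player_info, **usage_profile}
    # Step 504: categorize the player.
    categories = {name for name, rule in category_rules.items() if rule(profile)}
    # Step 506: determine offers appropriate for those categories.
    selected = [o for cat in categories for o in offers_by_category.get(cat, [])]
    # Step 508: provide the offers (e.g., e-mail, SMS, in-game message).
    for offer in selected:
        deliver(profile, offer)
    return selected
```

The `deliver` callback stands in for whichever channel (mail, SMS, in-game) the administrator configured.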
FIG. 6 illustrates an alternative process for determining and presenting an offer to a game player. Demographic information about a player is received or retrieved (step 602). Game play information is received or retrieved (step 604). Demographic and game play information is compiled to create a profile (step 606).
The profile is provided to a third party to determine the real world offer. A real world offer is determined for players matching the profile (step 608). For example, the profile can be provided to a travel company, and the travel company can determine a travel offer based, for example, on a player's profession, income and game play usage.
The third party offer is received from the third party (step 610). For example, the offer can be received electronically from a third party system, or the offer can be received in a paper form.
The real world offer is provided to the player (step 612). For example, the real world offer can be sent to the player through electronic or postal mail, and/or the offer can be presented electronically to the player while the player is interacting with a game system.
Although virtual golf and virtual fishing games have been described, the ideas presented herein can be applied to any game or activity (e.g., virtual tennis, baseball, horse racing, skiing, car racing, etc.).
Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a player, embodiments of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the player and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the player can provide input to the computer. Other kinds of devices can be used to provide for interaction with a player as well; for example, feedback provided to the player can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the player can be received in any form, including acoustic, speech, brain waves, other physiological input, eye movements, gestures, body movements, or tactile input.
Embodiments of the invention can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical player interface or a Web browser through which a player can interact with an implementation of the invention, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what can be claimed, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features can be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination can be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing can be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results.


Last edited by Paul on Wed 30 Mar 2022, 11:44 pm; edited 1 time in total


Paul



Please enjoy

_________________

May the SUN always be with you

home of

https://www.valleyofthesuncc.com/ an information and entertainment only website
Paul
Admin

Posts : 42010
Join date : 2013-05-06

https://www.valleyofthesuncc.com

Photographic mapping in a simulation

Post by Paul Sun 27 Mar 2022, 10:40 pm

Abstract



Methods and apparatus, including computer program products, for determining a location for a virtual object on a course terrain for a course. A photographic image of the course corresponding to the location is identified. The virtual object is incorporated in a presentation of the photographic image such that the virtual object appears in the photographic image.



Images (13)



Classifications



 


A63F13/655 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player  


US7847808B2


United States



Inventors: Yuchiang Cheng, Chad M. Nelson, David Castelnuovo. Current Assignee: World Golf Tour Inc


Worldwide applications
2007   US EP WO JP  


Application US11/779,787 events

2006-07-19: Priority to US83204706P
2007-07-18: Application filed by World Golf Tour Inc
2008-01-24: Publication of US20080018667A1
2010-12-07: Application granted
2010-12-07: Publication of US7847808B2
Status: Active
2027-07-18: Anticipated expiration





Patent citations (34), Non-patent citations (37), Cited by (30)

Description



CROSS REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Patent Application No. 60/832,047, entitled “Photographic Mapping in a Simulation”, to YuChiang Cheng, et al., which was filed on Jul. 19, 2006; the disclosure of the prior application is considered part of (and is incorporated by reference in) the disclosure of this application. This application is related to pending U.S. application Ser. No. 11/407,163, entitled “Automatically Adapting Virtual Equipment Model,” filed on Apr. 18, 2006, the entire contents of which are hereby incorporated by reference.
BACKGROUND
The present disclosure relates to using photographs in computer simulations.
Computer games and other types of simulations recreate real world environments such as baseball diamonds, race tracks, and golf courses through three dimensional (3D) computer generated graphics. However, such graphics can typically create unnatural visual artifacts, such as repeating patterns, which detract from the intended realism of the imagery. Some computer games may use a photograph of an actual location, such as mountains, as a background, with computer generated graphics rendered in the foreground. However, there may not be any interaction between the computer generated graphics and the terrain represented by the photograph.
SUMMARY
In general, in one aspect, embodiments of the invention feature determining a first location for a virtual object on a course terrain for a course. A photographic image of the course corresponding to the first location is identified. The virtual object is incorporated in a presentation of the photographic image such that the virtual object appears in the photographic image at a second location corresponding to the first location.
These and other embodiments can optionally include one or more of the following features. Interaction of the virtual object with one or more portions of the course terrain can be simulated. A visual representation of the interaction can be incorporated into the presentation of the first photographic image. The photographic image can be pre-fetched based on a history associated with a user or a group of users. User input can be accepted to cause interaction of the virtual object with one or more portions of the course terrain. In response to movement of the virtual object: a second location for a virtual object on a course terrain for a course can be determined; a second photographic image of the course corresponding to the second location can be identified; and the virtual object can be incorporated in a presentation of the second photographic image. The course terrain can be mapped to the photographic image. The location can be a real-world location on the course.
An avatar or a piece of virtual equipment can be incorporated into the presentation of the photographic image. The virtual object can be incorporated in a presentation of the photographic image such that the virtual object is animated in the photographic image. A visual effect can be incorporated into the presentation of the photographic image. The course can be one or more of: a golf course, a baseball diamond, a track, a tennis court, one or more roadways, or open terrain. The virtual object can be one of: a golf ball, a baseball, a tennis ball, an automobile, a bicycle, an animal, an avatar, a motorcycle, or a flying craft. The photographic image can be a composition of two or more photographs of the course. A photograph can be identified from a plurality of photographs where a target location is nearest to the horizontal center of the photograph. The terrain can be altered to account for distortion in the photographic image. The virtual object can be scaled in the presentation of the photographic image. The location can be projected in a three dimensional space to the second location in a two dimensional space.
In general, in another aspect, embodiments of the invention feature determining one or more potential locations for a virtual object on a course terrain for a course, the potential location based on a user history. A photographic image of the course corresponding to each potential location is identified. Each of the identified photographic images is obtained.
These and other embodiments can optionally include one or more of the following features. The virtual object can be incorporated in a presentation of one of the obtained photographic images.
In general, in another aspect, embodiments of the invention feature dividing a course into cells. A target point for the course is determined. Camera parameters for each cell are determined based on a size of the cell and the target point.
These and other embodiments can optionally include one or more of the following features. A shot list can be generated based on the determined camera parameters. Cell density can be altered based on course features.
Particular embodiments of the invention can be implemented to realize one or more of the following advantages. Players can be provided the experience of playing on a real course because of the integration of actual photographs of the course into the game play. Photographs can be pre-fetched based on one or more players' history to improve the performance of the game or simulation. Virtual objects can be integrated at the proper location and with the proper scale into actual photographs such that the player has the impression the virtual objects were actually photographed on the course. A course can be manually or automatically divided into a grid of potentially varying density, and a shot list can be automatically generated for the grid.
The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the invention will become apparent from the description, the drawings, and the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
FIGS. 1A-C illustrate a graphical user interface for a computer golf game that incorporates photographs of an actual golf course into the game play.
FIG. 2A is a flow diagram of a technique for photographic mapping in a simulation such as a computer game.
FIG. 2B is a flow diagram of a technique for pre-fetching photographic images for mapping in a simulation such as a computer game.
FIG. 3A illustrates a course grid.
FIG. 3B illustrates how photograph parameters are derived from a cell in a course grid.
FIG. 3C is an actual course photograph of a 25′6″×25′6″ cell.
FIG. 3D is an actual course photograph of a 10′3″×10′3″ cell.
FIG. 4 is a flow diagram illustrating a technique for automatically dividing a course into cells and generating a shot list.
FIG. 5 is an illustration of 3D terrain data for a course, including a course grid overlay.
FIG. 6A is a flow diagram illustrating a technique for incorporating a visual representation of virtual objects into a photograph.
FIG. 6B is an illustration of 3D mapping.
FIG. 7 illustrates a photographic mapping system.
Like reference numbers and designations in the various drawings indicate like elements.
DETAILED DESCRIPTION
Various implementations recreate the experience of playing on a course (e.g., a golf course, a baseball diamond, a race track) utilizing actual photographs of the course combined with simulated two dimensional (2D) and 3D graphics.
Computer games and other types of simulations typically include a virtual universe that players interact with in order to achieve one or more goals, such as shooting all of the “bad” guys or playing a hole of golf. Typical computer game genres include role-playing, first person shooter, third person shooter, sports, racing, fighting, action, strategy, and simulation. A computer game can incorporate a combination of two or more genres. Computer games are commonly available for different computer platforms such as workstations, personal computers, game consoles (e.g., Sony PlayStation and PlayStation Portable, Microsoft Xbox, Nintendo GameCube and Game Boy), cellular telephones, portable media players, and other mobile devices. Computer games can be single player or multi-player. Some multiplayer games allow players connected via the Internet to interact in a common or shared virtual universe.
A virtual universe is the paradigm with which the user interacts when playing a computer game and can include representations of virtual environments, objects, characters, and associated state information. For instance, a virtual universe can include a virtual golf course, golfers, golf clubs and golf balls. A virtual universe and its virtual objects can change as users achieve goals. For example, in action games as users advance to higher game levels, typically the virtual universe is changed to model the new level and users are furnished with different virtual equipment, such as more powerful weapons.
Players typically interact with one or more virtual objects in a virtual universe, such as an avatar and virtual equipment, through a user interface. A user interface can accept input from all manner of input devices including, but not limited to, devices capable of receiving mouse input, trackball input, button presses, verbal commands, sounds, gestures, eye movements, body movements, brain waves, other types of physiological sensors, and combinations of these. A click of a mouse button, for example, might cause a virtual golf club to swing and strike a virtual golf ball on a virtual golf course.
FIG. 1A illustrates a graphical user interface (GUI) 100 for a computer golf game that incorporates photographs of an actual golf course (e.g., 102 a) into the game play. Various visual representations of virtual objects have been integrated into the presentation of the photograph 102 a, including an avatar 104 representing the player, a piece of virtual equipment 112 representing a golf club, and a piece of virtual equipment 108 representing a golf ball. The player provides user input to the computer game which reacts by altering the state of the game's virtual universe based on the input and interaction of virtual objects in the virtual universe.
For example, player input can cause the avatar 104 to appear to hit the ball 108 with the club 112 towards the end of the green. A game engine can simulate the physics of the ball 108's aerial trajectory and eventual interaction with (e.g., bouncing and rolling) a golf course terrain in the virtual universe that represents the golf course. The terrain can be a model of the topography of an actual golf course, for example. As will be described below, the new location of the ball 108 in the 3D virtual golf course and, optionally, other virtual objects, are mapped to corresponding 2D locations in the photograph 102 a, or a different photograph, so that the virtual objects appear in the proper place, and at the proper scale, in the photograph as though they were actually photographed. In this way, the player is provided the experience of playing on an actual golf course.
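The 3D-to-2D mapping described above can be illustrated with a simple pinhole projection. This is an assumption for illustration only; the patent does not spell out the camera model, and the function and parameter names are hypothetical.

```python
def project_to_photo(world_pt, cam_pos, cam_rot, focal_px, img_w, img_h):
    """Map a 3D course-terrain point to 2D pixel coordinates in a photograph.
    cam_rot: 3x3 world-to-camera rotation matrix as nested lists.
    Returns (u, v) pixel coordinates, or None if the point is behind the
    camera or outside the photograph's frame."""
    # Translate into camera-relative coordinates, then rotate.
    d = [world_pt[i] - cam_pos[i] for i in range(3)]
    p = [sum(cam_rot[r][c] * d[c] for c in range(3)) for r in range(3)]
    if p[2] <= 0:                       # behind the camera: not visible
        return None
    # Perspective divide: farther objects project smaller (proper scale).
    u = img_w / 2 + focal_px * p[0] / p[2]
    v = img_h / 2 - focal_px * p[1] / p[2]
    if 0 <= u < img_w and 0 <= v < img_h:
        return (u, v)
    return None                         # falls outside this photograph
```

The perspective divide by depth is what gives a virtual ball the correct on-photo scale as it moves toward or away from the camera.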
FIG. 2A is a flow diagram 200 of a technique for photographic mapping in a simulation such as a computer game. User input is optionally obtained which causes a virtual object (e.g., golf ball 108) that is being tracked for purposes of triggering photographic mapping to interact with one or more portions of a 3D course terrain to be simulated (step 202). More than one virtual object can be tracked. Based on the simulation, a new location for the virtual object(s) on the course terrain is determined (step 204). For example, a game engine can simulate the physics of a golf ball trajectory and the ball's eventual impact with a golf course terrain.
A photographic image corresponding to the virtual object's new location on the course is identified (step 206). If there is more than one virtual object being tracked, a photograph corresponding to an area of the terrain encompassing the locations of all of the tracked virtual objects is identified. In one implementation, where there are multiple photographs covering a given course location, the photograph that provides the best view from the player's perspective is chosen. For example, the photograph which is closest to centered on the virtual object's new location would be chosen. Alternatively, more than one photograph of the virtual object's new location can be digitally stitched together to form a single, composite photograph. The virtual object(s) are incorporated into the photographic image (e.g., using 3D mapping; step 208).
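The closest-to-centered selection in step 206 might look like the following sketch, where the photo record format (a dict with `center` and `bounds` keys in course coordinates) is an assumption:

```python
import math

def best_photo(location, photos):
    """Pick the photograph covering `location` whose center is nearest it.
    photos: list of dicts with 'center' (x, y) and 'bounds'
    (xmin, ymin, xmax, ymax) in course coordinates."""
    x, y = location
    covering = [p for p in photos
                if p["bounds"][0] <= x <= p["bounds"][2]
                and p["bounds"][1] <= y <= p["bounds"][3]]
    if not covering:
        return None   # no coverage; a composite/stitch would be needed
    return min(covering, key=lambda p: math.hypot(p["center"][0] - x,
                                                  p["center"][1] - y))
```

When no single photograph covers the location, the fallback described in the text is to stitch adjacent photographs into a composite.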
FIG. 2B is a flow diagram 201 of a technique for pre-fetching photographic images for mapping in a simulation such as a computer game. Pre-fetching photographic images can improve performance of real-time games and other simulations by caching photographs ahead of time, before they need to be presented. This is especially true if the images must be retrieved from remote storage. First, one or more potential locations for a virtual object on a course terrain are determined based on a user's playing history, or on the playing history of a group of users, for that particular part of the course (step 203). Playing history can include information identifying past locations of virtual objects in the terrain for the user(s) and measures of the user(s)' game-playing abilities. A photographic image of the course corresponding to each potential location is then identified (step 205). The identified photographs are then pre-fetched (e.g., cached) so as to be ready for possible incorporation into the game play (step 207). One or more virtual objects are then incorporated into one of the obtained images (step 209) based on the new location of the virtual object.
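Steps 203-207 can be sketched as a history-driven cache warmer. The history format and the `photo_for_location` / `fetch_image` callables are hypothetical stand-ins, not names from the patent.

```python
def prefetch_photos(history, photo_for_location, fetch_image, cache, top_n=3):
    """history: past landing locations for this part of the course.
    Predicts the `top_n` most frequent past locations (step 203),
    identifies each location's photograph (step 205), and caches it
    ahead of need (step 207)."""
    counts = {}
    for loc in history:
        counts[loc] = counts.get(loc, 0) + 1
    likely = sorted(counts, key=counts.get, reverse=True)[:top_n]
    for loc in likely:
        photo_id = photo_for_location(loc)
        if photo_id is not None and photo_id not in cache:
            cache[photo_id] = fetch_image(photo_id)   # pre-fetch
    return cache
```

A real implementation would weight in a player's skill measures as the text suggests; frequency alone keeps the sketch short.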
Some or all of a tracked virtual object's simulated movement in the virtual universe can also be animated in an actual course photograph. For example, after the player strikes the golf ball 108 in photograph 102 a (as shown in FIG. 1A), photograph 102 b (as shown in FIG. 1B) can be presented to the player along with animation of the ball 108 falling from the sky at location 108 a, impacting the golf course at location 108 b, and rolling to resting location 108 c. If the ball 108 were to continue rolling beyond the edge of the photograph 102 b, a new photograph corresponding to the ball 108's new location could be displayed. This can continue until the ball 108 comes to rest. Alternatively, only the last such photograph (i.e., the photograph of the virtual object's resting place) need be presented. Visual representations of other virtual objects can also be animated in the photograph 102 a. For example, the avatar 104 can be animated such that the avatar 104 swings the golf club 112 and reacts to the swing.
Additional graphical information to help the player can be incorporated into the photograph and the GUI 100. As shown in FIG. 1C, a directional aiming arrow 120 is provided to assist the player in setting up a shot. An animated arc 122 can be drawn on the photograph to show the player the path the golf ball 108 will take in the air and on the course. Alternately, the arc 122 can be drawn as the golf ball 108 moves in the photograph 102 c. Two status areas 122 a-b are incorporated into the GUI 100 to provide information such as the current location in the virtual course, the player score, distance to the hole or pin 106, wind speed and direction, and the virtual club the player is using.
In order to systematically photograph an actual course (e.g., a track, a golf course, a baseball diamond, a football field, a tennis court, one or more roadways) for use in a computer game or other simulation, the course can be manually or automatically divided into a grid of cells. Each cell defines a physical area of the course that will be photographed for use in the simulation. Each cell can have one or more photographs associated with it. In one implementation, a cell is associated with a single photograph which encompasses the area of the course corresponding to the area of the cell. FIG. 3A is an illustration of a golf course 300 so divided. A course can be of any size and shape, and can include non-adjacent areas. Likewise, cells can have different sizes and shapes and do not have to be adjacent to one another. Cell density can vary depending on the portion of the course the cells cover. In one implementation, cell density increases in course areas where players are more likely to interact with virtual objects.
In the world of golf, for example, these areas would be putting greens (e.g., 302), tee boxes (e.g., 306 a-d), hazards such as sand traps (e.g., 304 a-d), and obstructions such as trees that golfers must circumnavigate. In other areas of the course, cell density is decreased, meaning fewer course photographs need to be taken. In one implementation, lower-density cell areas have a lower frequency of balls landing in them, require a wider area of visibility for the player, or both. In one implementation, image recognition software can be used to identify such areas of a course based on recognizing certain visible features (e.g., putting greens, sand traps, trees). By identifying areas of a course as having a high or low probability of player interaction, a course can be automatically divided into regions having appropriate cell densities.
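A minimal sketch of dividing a course into a variable-density grid, assuming rectangular "hot zones" (greens, tee boxes, traps) where finer cells are wanted; all names and units are hypothetical:

```python
def divide_course(width, height, base_cell, fine_cell, hot_zones):
    """Divide a course bounding box into square cells (x, y, size),
    subdividing any coarse cell whose centre falls inside a
    high-interaction zone. hot_zones is a list of (x0, y0, x1, y1)
    rectangles; all dimensions are integers in arbitrary units."""
    def in_hot_zone(x, y):
        cx, cy = x + base_cell / 2, y + base_cell / 2   # classify by centre
        return any(x0 <= cx < x1 and y0 <= cy < y1
                   for x0, y0, x1, y1 in hot_zones)

    cells = []
    for y in range(0, height, base_cell):
        for x in range(0, width, base_cell):
            if in_hot_zone(x, y):
                # high player interaction: subdivide into fine cells
                for fy in range(y, y + base_cell, fine_cell):
                    for fx in range(x, x + base_cell, fine_cell):
                        cells.append((fx, fy, fine_cell))
            else:
                cells.append((x, y, base_cell))
    return cells
```

Fewer, larger cells mean fewer photographs in low-traffic areas; more, smaller cells give higher detail where balls usually land.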
In one implementation, a course can have more than one layer of cells. The need for this may arise, for instance, when a player, through accident or poor skill, causes a virtual object being tracked for photographic mapping purposes to be located in a part of the course that rarely sees play. In FIG. 3A, small cell 308 b for tee box 306 a is the cell used by default for a photograph at this stage of the course, since most players are able to hit the ball quite a distance up the fairway. However, some players may cause the ball to land in close proximity to the tee box 306 a. The area just outside of the tee box 306 a is not included in the photograph for cell 308 b. However, secondary cell 308 a, overlaying the default cell 308 b, can be used to obtain a photograph when the ball lies within the bounds of cell 308 a. The photograph for the secondary cell 308 a encompasses the tee box 306 a and the surrounding area. A layer can be chosen based on rules or heuristics that can depend on the state of the virtual universe at a particular point in time. In one implementation, the layer that provides the smallest cell size is chosen. In another implementation, a layer can be chosen based on a required style of presentation. For example, it may be desirable to show a ball in flight passing through a cell for dramatic effect.
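The smallest-cell selection rule across layered grids might look like this in Python; the cell and layer representations are invented for illustration:

```python
def choose_cell(layers, position):
    """Return the photograph for the smallest cell containing the given
    position, searching every layer. Each cell is a rectangle
    (x0, y0, x1, y1, photo_id); returns None if no cell covers the
    position in any layer."""
    candidates = []
    for layer in layers:
        for x0, y0, x1, y1, photo in layer:
            if x0 <= position[0] < x1 and y0 <= position[1] < y1:
                candidates.append(((x1 - x0) * (y1 - y0), photo))
    # smallest covering cell wins (the smallest-cell-size rule)
    return min(candidates)[1] if candidates else None
```

A ball inside the default tee-box cell uses that cell's close-up photograph; a ball just outside it falls through to the larger secondary cell.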
As discussed above, in one implementation each cell in a course grid is photographed such that the photograph encompasses the area of the course in the cell. For example, the photograph shown in FIG. 3C is of a 25′6″×25′6″ cell indicated by the white boundary 301. Two avatars (104 a-b) have been rendered in the photograph to illustrate how the scale of virtual objects changes based on their position in the photograph. How this is accomplished is described below. The photograph is taken by a camera at a specified 3D position (longitude, latitude, and altitude) on the actual course. A camera's 3D position can be determined by use of a Global Positioning System (GPS) or a ground-based navigation system, for example. Since the position of each cell is known, the camera position can be specified as a setback distance from the cell and a height above the cell. In the photograph of FIG. 3C, the camera is positioned 29′6″ away from the cell and at a height of 10′3″ above the cell. A 24 mm lens was used. FIG. 3D is a photograph of a 10′3″×10′3″ cell where the camera was positioned at a setback of 12′6″ and a height of 5′6″, using an 18 mm lens.
FIG. 3B is a portion of the course 300 illustrating determination of camera position and direction for a cell. In one implementation, a camera's position and direction can be determined based on a target position for a given cell. In golf, for example, generally the target will be the hole unless the fairway turns such that players must aim for the turn in order to set up a shot for the hole. In the latter case, the target would be the turning point in the fairway. The target for cells 310 a and 310 b is hole 302. A line passes through the center of each cell to the target. It is along this line that the camera lens will point towards the target. The location of the camera on the course will be along the line and outside of the cell. Cell 310 a's camera is located at position 312 a along the line defined by endpoints 312 a and 302. Likewise, cell 310 b's camera is located at position 312 b along the line defined by endpoints 312 b and 302.
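Computing a camera position from a cell's center, the target point, and a setback distance could be sketched as follows (2D course coordinates plus a height; the names are illustrative):

```python
import math

def camera_position(cell_center, target, setback, height):
    """Place the camera on the line through the cell centre and the
    target, 'setback' units behind the cell centre on the far side
    from the target, at the given height. Returns the 3D position and
    the heading (degrees) of the lens toward the target."""
    dx, dy = target[0] - cell_center[0], target[1] - cell_center[1]
    dist = math.hypot(dx, dy)
    ux, uy = dx / dist, dy / dist                  # unit vector toward target
    cam = (cell_center[0] - ux * setback,          # step back, away from target
           cell_center[1] - uy * setback,
           height)
    heading = math.degrees(math.atan2(target[1] - cam[1], target[0] - cam[0]))
    return cam, heading
```

Because the camera sits on the cell-to-target line, the lens naturally points through the cell toward the target, matching the geometry of FIG. 3B.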
In one implementation, the focal length of the lens, the angle of the lens, the offset of the camera from the edge of the cell, and the height of the camera (i.e., the camera parameters) can be predefined for a given cell size. In another implementation, one or more of the focal length, the angle of the lens, and the 3D location of the camera can be dynamically determined. By way of illustration, such a determination can take into account the topology of the cell. If a given cell were in a valley, for instance, it could be beneficial to provide more of an overhead shot so that a player does not lose perspective with the surrounding course area.
FIG. 4 is a flow diagram illustrating a technique 400 for automatically dividing a course into cells and generating a shot list. Since a course can be automatically divided into cells, and since camera parameters for each cell can be automatically determined, a so-called shot list can be automatically generated. A shot list is a list of the photographs that need to be taken for the cells in a given course. Each shot includes a 3D location of the camera, lens focal length, direction, and angle. A course is initially divided into cells as described above (step 402). One or more target points are determined for the course (e.g., 302; step 404). Camera parameters are determined for each cell based on the target point(s) and/or cell size (step 406). Finally, a shot list is generated describing the camera setup required to photograph each cell on the course. In a further implementation, the shot list can be downloaded to a robotic device with an attached camera, such as a robotic helicopter capable of hovering at precise 3D coordinates. The robotic device can then capture photographs for one or more of the cells.
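A shot-list generator along the lines of steps 402-406 might look like this. The per-cell-size parameter table reuses the example figures given for FIGS. 3C and 3D (in feet), and the data shapes are assumptions:

```python
# Per-cell-size camera parameters taken from the two example photographs
# described in the text: a 25'6" cell (24 mm lens, 29'6" setback, 10'3"
# height) and a 10'3" cell (18 mm lens, 12'6" setback, 5'6" height).
PARAMS = {
    25.5:  {"focal_mm": 24, "setback": 29.5, "height": 10.25},
    10.25: {"focal_mm": 18, "setback": 12.5, "height": 5.5},
}

def generate_shot_list(cells, target, params=PARAMS):
    """Steps 402-406 condensed: given cells as (center, size) pairs and
    a course target point, emit one shot specification per cell."""
    shots = []
    for center, size in cells:
        p = params[size]
        shots.append({"cell_center": center, "target": target,
                      "focal_mm": p["focal_mm"], "setback": p["setback"],
                      "height": p["height"]})
    return shots
```

Each entry carries everything a photographer, or a hovering robotic camera, needs to set up one shot.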
FIG. 5 is an illustration of 3D terrain data for a course, including a course grid overlay. A terrain represents elevation data for a course, providing a 3D digital elevation map (e.g., a terrain mesh) of features on the course. The terrain is used by the game engine to simulate how virtual objects physically interact with the course and to determine where the virtual objects appear in photographs of the course. Each cell (e.g., 502) maps to a portion of the course terrain. In one implementation, the digital elevation map is accurate to within a centimeter; however, higher and lower resolutions are possible. Terrain data can be collected in a number of ways including, but not limited to, aerial photogrammetric mapping (APM), laser 3D imaging, and GPS real-time kinematic (GPS-RTK) surveys.
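Querying such a terrain mesh for the elevation under a virtual object can be sketched with bilinear interpolation over a regular elevation grid. This is a simplification: the patent does not specify the mesh format, and real course meshes may be irregular:

```python
def elevation_at(mesh, spacing, x, y):
    """Bilinearly interpolate the elevation at (x, y) from a regular
    grid, where mesh[row][col] is the surveyed height at
    (col * spacing, row * spacing)."""
    col, row = x / spacing, y / spacing
    c0, r0 = int(col), int(row)
    fc, fr = col - c0, row - r0                      # fractional offsets
    c1 = min(c0 + 1, len(mesh[0]) - 1)               # clamp at grid edges
    r1 = min(r0 + 1, len(mesh) - 1)
    top = mesh[r0][c0] * (1 - fc) + mesh[r0][c1] * fc
    bottom = mesh[r1][c0] * (1 - fc) + mesh[r1][c1] * fc
    return top * (1 - fr) + bottom * fr
```

The physics simulation would call this to decide where a rolling ball rests; the projection step described next uses the same heights to place objects in the photograph.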
FIG. 6A is a flow diagram illustrating a technique for incorporating a visual representation of virtual objects into a photograph. As described above, a game or simulation engine determines the location of a tracked virtual object in 3D virtual course space (i.e., on or above the terrain). A terrain area in which the virtual object is located is identified (step 602). Next, the camera that took the photograph for the cell covering the terrain area is simulated (step 604). As shown in FIG. 6B, a virtual camera 603 simulates the exact view 605 of the actual camera based on known parameters of the camera (e.g., the 3D position of the camera, the angle and direction of the lens, and the focal length of the lens). Using a 3D perspective projection, the virtual object(s) (e.g., ball 108) in the 3D virtual course space are projected into the 2D viewing plane 605 of the simulated camera 603 (step 606). A perspective projection ensures that virtual objects farther from the virtual camera appear smaller in relation to those that are closer, adding to the sense of realism. In one implementation, the projection can compensate for visual distortion in the camera lens. The virtual objects in the 2D projection are then incorporated into the actual photograph of the cell (e.g., 102 b; step 608). Additional virtual objects (e.g., avatars, virtual equipment) can also be dynamically included in the projection even though the positions of these objects may not be used to trigger photographic mapping.
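Step 606's perspective projection can be illustrated with a simple pinhole-camera model. The vector conventions and the pixel-focal-length parameterization below are assumptions for illustration, not taken from the patent:

```python
def project_point(point, cam_pos, forward, up, focal_px, center):
    """Map a 3D course-space point to 2D pixel coordinates through a
    simulated pinhole camera. 'forward' and 'up' are orthonormal unit
    vectors giving the lens direction and camera orientation; focal_px
    is the focal length expressed in pixels; center is the image
    centre. Returns None for points behind the camera. Farther points
    project closer to the image centre, producing the perspective
    scaling described in the text."""
    def sub(a, b):
        return tuple(ai - bi for ai, bi in zip(a, b))

    def dot(a, b):
        return sum(ai * bi for ai, bi in zip(a, b))

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    right = cross(forward, up)           # completes the camera basis
    rel = sub(point, cam_pos)
    depth = dot(rel, forward)            # distance along the lens axis
    if depth <= 0:
        return None                      # behind the camera plane
    u = center[0] + focal_px * dot(rel, right) / depth
    v = center[1] - focal_px * dot(rel, up) / depth
    return (u, v)
```

Projecting the ball, avatars, and equipment this way places each at the correct pixel position and, via the depth division, at the correct apparent scale in the cell photograph.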
FIG. 7 illustrates a photographic mapping system 700 for a computer game application or other simulation. The functionality encompassed in the system 700 can be distributed to fewer or more components than those illustrated. Moreover, components can be distributed to one or more devices that are capable of communicating over one or more commercial or proprietary networks or other communication interfaces. A game engine 706 maintains state for the virtual universe 708 based on user input and the interaction of objects in the virtual universe 708. An input model 702 maps user inputs (e.g., button presses, voice commands, sounds, gestures, eye movements, body movements, brain waves, other types of physiological sensors, and combinations of these) to one or more variable values or signals for processing by the game engine 706.
The game engine 706 can include a renderer for rendering views of the virtual universe that can be provided to the representation component 704 for presentation on a display device, for example. A representation component 704 receives audio/video rendering data from the game engine 706, and other rendering information including haptic data, and provides such to one or more output devices such as display devices, sound generation devices, haptic feedback devices, and other suitable devices.
The game engine 706 can include artificial intelligence capabilities for determining one or more future states of the virtual universe. The game engine 706 can also have the ability to simulate the physics of virtual objects interacting with a course terrain, for example. Virtual objects in the virtual universe 708 are associated with assets 712 (e.g., content, models, sounds, physics, artificial intelligence). Assets are used by the game engine 706 to represent virtual objects and render the computer game. Other assets include actual course photographs which are dynamically incorporated into the game GUI 100, as described above.
The game engine 706 employs a photographic mapping component 710 to identify photographs corresponding to the current location of game play and map one or more virtual objects from the 3D virtual course space into a viewing plane for incorporation into a photograph of the actual course. In one implementation, course photographs are tagged with data identifying the cell or course terrain region the photographs correspond to.
Embodiments of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, embodiments of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
Embodiments of the invention can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results.


Last edited by Paul on Wed 30 Mar 2022, 11:43 pm; edited 2 times in total



VEM VARIOUS PATENTS AND PURPOSES Empty Electronic game utilizing photographs PART 1

Post by Paul Sun 27 Mar 2022, 10:47 pm

The present disclosure includes, among other things, methods and apparatus, including computer program products, for providing an electronic game utilizing photographs.




Classifications:

A63F13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor, automatically by game devices or servers from real world data, e.g. measurement in live racing competition
A63F13/10
A63F13/812 Ball games, e.g. soccer or baseball
A63F2300/69 Involving elements of the real world in the game world, e.g. measurement in live races, real video
A63F2300/8011 Ball


US20080293488A1 (United States)

Inventors: Yuchiang Cheng, Chad M. Nelson, David Montgomery, Phil Gorrow, David Castelnuovo
Current Assignee: World Golf Tour Inc

Worldwide applications (2008): US, WO, US, US, TW, US

Application US12/154,311 events:
2007-05-21 Priority to US93931207P
2008-05-21 Application filed by World Golf Tour Inc
2008-11-27 Publication of US20080293488A1
Status: Abandoned

Description



    CROSS REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application claims priority to U.S. Provisional Application Ser. No. 60/939,312, entitled “Integrating Objects in Three-Dimensional Space into Photographs,” filed on May 21, 2007, the entire contents of which are hereby incorporated by reference.
    BACKGROUND
  • [0002]
    Electronic games and other types of simulations recreate real world environments such as baseball diamonds, race tracks, and golf courses through three dimensional (3D) computer generated graphics. However, such graphics can typically create unnatural visual artifacts such as repeating patterns which detract from the intended realism of the imagery. Some electronic games may use a photograph of an actual location as a background, such as mountains, with computer generated graphics rendered in the foreground. However, there may not be any interaction between the computer generated graphics and the terrain represented by the photograph.
    SUMMARY
  • [0003]
    In general, one aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes selecting a prior state of an interactive electronic game from a plurality of prior states, the prior state identifying user input previously provided to the electronic game and a set of values representing a condition of the electronic game before the user input was processed by the electronic game. A current condition of the electronic game is set according to the set of values, and the user input is provided to the electronic game. A new set of values corresponding to a new condition of the electronic game is obtained by processing of the user input by the electronic game based on the current condition and the set of values. A sequence of one or more photographic images is selected based on the new set of values. Other implementations of this aspect include corresponding systems, apparatus, and computer program products.
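The record-and-replay idea in this aspect can be sketched as follows, assuming a deterministic game-step function; the class structure and names are hypothetical:

```python
class ReplayLog:
    """Record-and-replay sketch: before each input is processed, store
    the pre-input values together with the input under an identifier.
    Replaying restores that stored condition and re-feeds the input to
    a deterministic engine, reproducing the same new set of values
    (and hence the same photograph sequence)."""
    def __init__(self, game_step):
        self.game_step = game_step   # deterministic (values, input) -> values
        self.states = {}

    def play(self, state_id, values, user_input):
        self.states[state_id] = (values, user_input)   # save prior state
        return self.game_step(values, user_input)

    def replay(self, state_id):
        values, user_input = self.states[state_id]     # select prior state
        return self.game_step(values, user_input)
```

Because only the identifier is needed to replay a shot, it could travel in a short message over a network, consistent with the features described next.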
  • [0004]
    These and other implementations can optionally include one or more of the following features. The interactive electronic game simulates a game of skill. The interactive electronic game is a first-person shooter game. Selecting the prior state includes basing the selection on a received identifier of the prior state. The identifier is part of a message sent over one or more computer networks. The new set of values includes a three-dimensional path of a virtual object relative to a physical terrain. The method can further include selecting the sequence of one or more photographic images based on the path. The method can further include incorporating a representation of a virtual object into one or more photographic images in the sequence of one or more photographic images based on the new set of values. The method can further include receiving input indicating shot preferences; and selecting the sequence of one or more photographic images based on the shot preferences.
  • [0005]
    In general, another aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a three-dimensional path relative to a model of a physical terrain for a physical course, where a plurality of areas of the physical course are captured by one or more two-dimensional photographic images. Which of the physical course areas are on the path is determined. A sequence of one or more photographic images having a view of the physical course areas on or about the path is selected. Other implementations of this aspect include corresponding systems, apparatus, and computer program products.
  • [0006]
    These and other implementations can optionally include one or more of the following features. The path at least partially lies on the physical terrain. The model is a topology of the physical course. Two or more of the areas overlap each other. Determining the three-dimensional path includes modeling the physics of a virtual object's interaction with the model of the physical terrain. The model of the physical terrain includes one or more obstacles rising vertically from the terrain and determining the three-dimensional path includes modeling the physics of the virtual object's interaction with the one or more obstacles. Each photographic image is associated with a priority and where selecting the sequence of one or more photographic images is based on the associated priorities. Selecting a sequence of one or more photographic images includes determining if two or more first photographic images have a view of an area on or about the path; and selecting the first photographic image with the highest priority. Determining which of the physical course areas are on the path includes determining if the path lies on or over a portion of the model of the physical terrain that is captured by a two-dimensional photographic image. Selecting a sequence of one or more photographic images is governed by a script.
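Priority-based selection of a photograph sequence along a path might be sketched like this; rectangular photo areas and a point-sampled path are simplifying assumptions:

```python
def select_photo_sequence(path, areas):
    """Select the sequence of photographs whose areas the path crosses.
    path: sampled 3D points (x, y, z); altitude is ignored for area
    coverage here. areas: (x0, y0, x1, y1, priority, photo_id). Where
    several overlapping photographs cover the same path point, the
    highest priority wins (ties break by photo id); consecutive
    duplicates are collapsed into one shot."""
    sequence = []
    for x, y, _z in path:
        covering = [(prio, photo)
                    for x0, y0, x1, y1, prio, photo in areas
                    if x0 <= x < x1 and y0 <= y < y1]
        if not covering:
            continue
        photo = max(covering)[1]               # highest-priority view
        if not sequence or sequence[-1] != photo:
            sequence.append(photo)
    return sequence
```

Sampling the ball's computed trajectory through this function yields the ordered list of course photographs to present as the ball flies, bounces, and rolls.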
  • [0007]
    Particular implementations of the invention can be implemented to realize one or more of the following advantages. Players are provided the experience of playing on a real course because of the integration of actual photographs of the course into the game play. Photographs can be pre-fetched based on one or more players' histories to improve the performance of the game or simulation. Virtual objects are integrated at the proper location and with the proper scale into actual photographs such that the player has the impression the virtual objects were actually photographed on the course. Representations of real world objects in the photographs can be assigned characteristics similar to the characteristics that the real world objects have, such as hardness, elasticity, friction, and the ability to change or slow the trajectory of a virtual object that interacts with the real world object. The representations of real world objects can also be made to obscure the virtual object when the virtual object would be hidden behind the real world object. Creating a course terrain with attributes allows virtual objects to interact with objects in the terrain in a natural way and provides a more realistic presentation of the game to a player. A course can be manually or automatically divided into a grid of potentially varying density and a shot list can be automatically generated for the grid. Shot sequences are automatically determined based on a number of factors. Games can be replayed and replay information can be shared with other users.
  • [0008]
    The details of one or more implementations of the invention are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the invention will become apparent from the description, the drawings, and the claims.
    BRIEF DESCRIPTION OF THE DRAWINGS
  • [0009]
    The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
  • [0010]
    FIGS. 1A-C illustrate an example graphical user interface for a computer golf game that incorporates photographs of an actual golf course into the game play.
  • [0011]
    FIG. 2A is a flowchart of an example technique for photographic mapping in a simulation such as an electronic game.
  • [0012]
    FIG. 2B is a flowchart of an example technique for pre-fetching photographic images for mapping in a simulation such as an electronic game.
  • [0013]
    FIG. 3A illustrates an example course grid.
  • [0014]
    FIG. 3B illustrates an example of how photograph parameters can be derived from a cell in a course grid.
  • [0015]
    FIG. 3C is an actual course photograph of a 25′6″×25′6″ cell.
  • [0016]
    FIG. 3D is an actual course photograph of a 10′3″×10′3″ cell.
  • [0017]
    FIG. 4 is a flowchart illustrating an example technique for automatically dividing a course into cells and generating a shot list.
  • [0018]
    FIG. 5A is an illustration of an example of a course terrain.
  • [0019]
    FIG. 5B1 shows an example delineation of surface types assigned to a photograph.
  • [0020]
    FIG. 5B2 is a flowchart of an example technique for assigning surface types to objects in a photograph.
  • [0021]
    FIG. 5C1 is a photograph of a golf fairway with trees.
  • [0022]
    FIG. 5C2 is a flowchart of an example technique for illustrating how the real world objects obtain collision properties.
  • [0023]
    FIG. 5D shows an example location of tree trunks and palm fronds in the trees in FIG. 5C1.
  • [0024]
    FIG. 5E is a ball trajectory of an example golf ball hitting the palm fronds of FIG. 5C1.
  • [0025]
    FIG. 5F is a ball trajectory of an example golf ball hitting a trunk of FIG. 5C1.
  • [0026]
    FIG. 5G is an overhead view of an example hazard area on a golf course.
  • [0027]
    FIG. 5H shows an example location of bushes and ground cover in the hazard area.
  • [0028]
    FIG. 5I is a photograph of an example golf hole with trees.
  • [0029]
    FIG. 5J is an example representation of the trees in the photograph.
  • [0030]
    FIG. 5K is an example representation of a virtual ball in front of the trees.
  • [0031]
    FIG. 5L is an example representation of a virtual ball's path leading behind a tree.
  • [0032]
    FIG. 5M is an example representation of a virtual ball's path leading behind a ridge.
  • [0033]
    FIG. 5N is a flow chart illustrating how a virtual object can be displayed during play.
  • [0034]
    FIG. 5O is a flowchart illustrating an example use of the attributes assigned to the real world images.
  • [0035]
    FIG. 5P is a flowchart illustrating an example method of representing the movement of a virtual object.
  • [0036]
    FIG. 6A is a flowchart illustrating an example technique for incorporating a visual representation of virtual objects into a photograph.
  • [0037]
    FIG. 6B is an illustration of an example 3D mapping.
  • [0038]
    FIGS. 7A-C are diagrams illustrating example client-server architectures.
  • [0039]
    FIG. 7D is a schematic diagram of an example client.
  • [0040]
    FIG. 7E is an overhead view of an example virtual course illustrating cells along a virtual object path.
  • [0041]
    FIG. 7F is a profile view of an example virtual object path in relation to a model of a physical terrain.
  • [0042]
    FIG. 7G is a flowchart illustrating an example technique for shot selection.
  • [0043]
    FIG. 7H is a schematic diagram of an example server.
  • [0044]
    FIG. 7I is a flowchart of an example method for replaying a simulation.
  • [0045]
    FIG. 7J is an illustration of an example swing meter.
  • [0046]
    Like reference numbers and designations in the various drawings indicate like elements.
    DETAILED DESCRIPTION
  • [0047]
    Various implementations recreate the experience of playing on a course (e.g., a golf course, a baseball diamond, a race track) utilizing digital representations of actual photographs of the course combined with computer-generated two-dimensional (2D) and three-dimensional (3D) graphics, animation and effects.
  • [0048]
    Electronic games and other types of simulations typically include a virtual universe that players interact with in order to achieve one or more goals, such as shooting all of the “bad” guys or playing a hole of golf. Typical electronic game genres include role-playing, first person shooter, third person shooter, sports, racing, fighting, action, strategy, and simulation. An electronic game can incorporate a combination of two or more genres. Electronic games are commonly available for different computer platforms such as workstations, personal computers, game consoles (e.g., Sony PlayStation and PlayStation Portable, Microsoft Xbox, Nintendo GameCube, Game Boy and Wii), cellular telephones, portable media players, and other mobile devices. Electronic games can be single player or multi-player. Some multi-player games allow players connected via the Internet to interact in a common or shared virtual universe.
  • [0049]
    A virtual universe is the paradigm with which the user interacts when playing an electronic game and can include representations of virtual environments, objects, characters, and associated state information. For instance, a virtual universe can include a virtual golf course, golfers, golf clubs and golf balls. A virtual universe and its virtual objects can change as users achieve goals. For example, in action games as users advance to higher game levels, typically the virtual universe is changed to model the new level and users are furnished with different virtual equipment, such as more powerful weapons.
  • [0050]
    Players typically interact with one or more virtual objects in a virtual universe, such as an avatar and virtual equipment, through a user interface. A user interface can accept input from all manner of input devices including, but not limited to, devices capable of receiving mouse input, trackball input, scroll wheel input, button presses, verbal commands, sounds, gestures, eye movements, body movements, brain waves, other types of physiological sensors, and combinations of these. A click of a mouse button, for example, might cause a virtual golf club to swing and strike a virtual golf ball on a virtual golf course.
  • [0051]
    FIG. 1A illustrates an example graphical user interface (GUI) 100 for a computer golf game that incorporates digital representations of photographic images of an actual golf course (e.g., 102 a) into the game play. Various visual representations of virtual objects have been integrated into the presentation of the photograph 102 a, including an avatar 104 representing the player, a piece of virtual equipment 112 representing a golf club, and a virtual object 108 representing a golf ball. The player provides user input to the electronic game, which reacts by altering the state of the game's virtual universe based on the input and the interaction of virtual objects in the virtual universe. The game's state at a point in time can be represented by a set of values.
  • [0052]
    For example, player input can cause the avatar 104 to appear to hit the ball 108 with the club 112 towards the end of the green. A game engine can simulate the physics of the ball 108's aerial trajectory and eventual interaction (e.g., bouncing and rolling) with a physical golf course terrain in a virtual golf course. A course terrain is a 3D model of the topography of the physical course terrain (e.g., a golf course). A course terrain includes elevation data for a course and can be represented as a 3D digital elevation map (e.g., terrain mesh) of features on the course. The course terrain is used to simulate how virtual objects physically interact with the virtual course and where the virtual objects appear in photographs of the course. Topography data can be collected in a number of ways including, but not limited to, aerial photogrammetric mapping (APM), laser 3D imaging and GPS real-time kinematic (GPS-RTK) surveys. As will be described below, the new location of the ball 108 in the virtual golf course is mapped to a corresponding 2D location in the photograph 102 a, or a different photograph, so that the ball appears in the proper place and at the proper scale in the photograph as though the ball were actually in the original photograph. In this way, the player is provided the experience of playing on an actual golf course.
  • [0053]
    In various implementations, a visual meter 145 is provided to indicate the amount of backswing that corresponds to the player's input manipulating the club 112. In some implementations, the further the club 112 is pulled back, the more difficult it is for the player to accurately contact the ball 108 with the sweet spot of the club 112. The sweet spot is the portion of the clubface that produces optimum distance and ball flight and does not cause the club to torque or twist to either side when contact is made with the ball. More information on factors affecting the sweet spot can be found in U.S. application Ser. No. 11/407,163, filed Apr. 18, 2006, entitled, “Automatically Adapting Virtual Equipment Model”, which is incorporated by reference herein for all purposes. The optimum club contact timing and location can be indicated by a goal bar 152. Various ranges outside of the goal bar 152 indicate how difficult it will be for the player to make a great shot (area 150), a good shot (area 154) or a poor shot (area 156). The great shot area 150 can correspond to hitting the ball 108 with the sweet spot of the club 112 in a live golf game. A maximum possible shot area can be indicated by a bar 148. As the player increases the backswing, the good and great shot areas 154, 150 can shrink, indicating increasing difficulty in controlling the club 112 as the backswing increases. In some implementations, the different areas that indicate difficulty are shown in different colors. In some implementations, the different areas that indicate difficulty are shown with outlines that contrast with the background. In yet other implementations, the difficulty areas are not strictly separate areas, but are shown as gradations, where the locations closest to the goal bar 152 are the better shots and locations farther from the goal bar 152 are the worse shots.
  • [0054]
    The player then initiates a downswing motion after the backswing height has been selected. By way of illustration, the player can initiate the downswing motion by reversing the motion used to cause the golfer avatar 104 to perform its backswing, by releasing pressure from a scroll wheel, or by releasing a button that is held while the user uses the scroll wheel to input the backswing action. The club head location indicator 146 then moves along the meter 145, approaching the goal bar 152. The player selects the quality of the golf swing by pressing a button or clicking the scroll wheel when the club head location indicator 146 is close to the goal bar 152, for example. How close the player is able to get the club head location indicator 146 to the goal bar 152 when the player makes the selection determines how the club 112 will impact the ball 108. In some implementations, the closer the player is able to get the club head location indicator 146 to the goal bar 152, the straighter the shot and/or the further the ball flies in response. If the player does not provide input quickly enough and misses the goal bar 152, the club head indicator 146 continues to move progressively farther out into the great area 150, the good area 154 and finally the poor area 156. In some implementations, if the player activates the impact with the ball too soon or too late, the golfer hits the ground with the club, slices or hooks the ball.
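    The timing mechanic above can be sketched as a simple classification of the indicator's distance from the goal bar at the moment the player commits the downswing. This is an illustrative assumption, not the patent's implementation: the zone radii below are invented, and the shrinking of zones with greater backswing is ignored.

```python
# Hypothetical sketch of the swing-meter logic: shot quality depends on how
# close the club-head indicator is to the goal bar when the player commits.
# All positions are normalized to [0, 1] along the meter; the zone radii
# are assumed values, not taken from the patent text.

def shot_quality(indicator_pos: float, goal_pos: float,
                 great_radius: float = 0.05, good_radius: float = 0.15) -> str:
    """Classify a shot by the indicator's distance from the goal bar."""
    distance = abs(indicator_pos - goal_pos)
    if distance <= great_radius:
        return "great"   # corresponds to hitting the sweet spot
    if distance <= good_radius:
        return "good"
    return "poor"        # too early or too late: ground, slice, or hook

print(shot_quality(0.52, 0.50))  # "great"
print(shot_quality(0.90, 0.50))  # "poor"
```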
  • [0055]
    In various implementations, the player can also select a greater backswing, as indicated by the height of the golfer's club 112. A greater backswing may be used to drive the ball down the fairway. The great area 150 is smaller when the golfer avatar 104 increases its swing as compared to when the player is putting, chipping or pitching the ball, for example. That is, there can be an inverse relationship between the size of the sweet spot and the power of the swing. The downswing and impact are similar to other swings, but with increased difficulty in accurately making impact with the ball. This is described further in U.S. application Ser. No. 11/619,919, entitled “Rotational Game Input”, filed on Jan. 4, 2007, the entire contents of which are hereby incorporated by reference.
  • [0056]
    In some implementations, the manner in which the player uses the input device, such as a scroll wheel, a keyboard or a mouse, affects an aspect of the golfer's swing. For example, when the input device is a scroll wheel device, the speed of the player's arcuate input into the scroll wheel device as the player initiates the down swing can affect the golfer's swing, such as by determining in part the speed of the golfer's swing or the distance of the shot. Alternatively, or in addition, the smoothness of the player's tempo of the motion on the scroll wheel can determine how straight the shot is. Hesitation or jerkiness of the player's action can cause the shot to either slice or hook. In some implementations, the player can input that he or she is a right handed player or a left handed player. The action that maps to a given direction of rotation on the scroll wheel, that is, whether a clockwise or a counterclockwise rotation indicates the backswing, can change depending on the handedness of the player.
  • [0057]
    Various methods of initiating the downswing and impact time can be used in place of or in conjunction with methods described above. In some implementations, after causing the golfer to backswing, the player releases the scroll wheel to start the downswing. In other implementations, the player selects a button or taps the scroll wheel to initiate the downswing.
  • [0058]
    In addition to a scroll wheel device, the user input device may be a mouse, a joystick or buttons. Other user input devices are possible. The movement of the mouse, the length of time the joystick is held in one direction or the length of time that a button is depressed may affect the golfer's swing or the distance of the shot. Additionally, some combination of depressing buttons or moving a mouse may determine the amount of backswing, the moment of impact with the ball, the amount of follow through, or the direction of the ball.
  • [0059]
    FIG. 2A is a flowchart 200 of an example technique for photographic mapping in a simulation such as an electronic game. User input is optionally obtained which causes one or more virtual objects (e.g., golf ball 108) to interact in a virtual course (step 202). Based on a simulation or other means, one or more new locations for a virtual object on or above the course terrain are determined (step 204). For example, a game engine can simulate the physics of a virtual golf ball's trajectory, collision with a virtual tree, and the ball's eventual landing, rolling and resting on the course terrain. In various implementations, the movement of the ball from the time from when the ball is put into play until when the ball comes to rest is represented by a 3D path through the virtual course. When the ball is airborne, the path is above the course terrain and when the ball is in contact with the course terrain, the path lies on the terrain. The path is considered part of the state of the golf game's virtual universe. Positions along the path can be identified for purposes of determining the location of the ball in the virtual course over time.
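    The photograph-selection step (206) of this technique can be sketched as a search over cell photographs, preferring the one whose cell center is closest to the object's new location, as described below. The CellPhoto class and the axis-aligned cell bounds are illustrative assumptions, not the patent's data model.

```python
# Minimal sketch of step 206 of FIG. 2A: given the virtual object's new
# 2D course location, find the photograph whose cell covers it, preferring
# the photo most nearly centered on the location. Illustrative only.

from dataclasses import dataclass

@dataclass
class CellPhoto:
    # axis-aligned cell bounds on the course, in course coordinates (feet)
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def covers(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def select_photo(photos, x, y):
    """Choose the photo covering (x, y) whose cell center is closest."""
    candidates = [p for p in photos if p.covers(x, y)]
    if not candidates:
        return None
    def center_dist(p):
        cx = (p.x_min + p.x_max) / 2
        cy = (p.y_min + p.y_max) / 2
        return (cx - x) ** 2 + (cy - y) ** 2
    return min(candidates, key=center_dist)

photos = [CellPhoto(0, 0, 25.5, 25.5), CellPhoto(20, 0, 45.5, 25.5)]
best = select_photo(photos, 23.0, 10.0)  # both cells cover; second is closer-centered
```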
  • [0060]
    One or more photographic images of the course corresponding to the virtual object's new location(s) on or above the course terrain are identified (step 206). If there is more than one virtual object, a photograph corresponding to an area of the course encompassing the location of all of the virtual objects can be identified. In various implementations, where there are multiple photographs covering a given course location, the photograph that provides the best view from the player's perspective is chosen. For example, the photograph whose center is closest to the virtual object's new location would be chosen. Alternatively, more than one photograph of the virtual object's new location can be digitally stitched together to form a single, composite photograph. Other techniques for photograph selection are discussed below. The virtual object is then incorporated into the photographic image(s) using a mapping technique that is described below (step 208). The virtual object can be animated in the photograph(s) and appears at the proper location and scale based on the virtual object's location in relation to the course terrain.
  • [0061]
    FIG. 2B is a flowchart 201 of an example technique for pre-fetching photographic images for mapping in a simulation such as an electronic game. Pre-fetching photographic images can improve the responsiveness of interactive applications by locally caching photographs ahead of time before they are needed. This is especially true if images need to be retrieved from remote storage such as a server. One or more potential locations for a virtual object in a virtual course are determined (step 203). In various implementations, the determination can be based on where game play is expected to proceed, for example given a user's playing history or the playing history of a group of users for that particular part of the virtual course. By way of illustration, playing history can include information identifying past locations of virtual objects in the virtual course for the user and measures of the user's playing abilities. Player history can include other information. A photographic image of the course corresponding to each potential location is then identified (step 205). The identified photographs are then pre-fetched (e.g., cached) so as to be ready for possible incorporation into the game play (step 207). The virtual object is then incorporated into one of the obtained images (step 209) based on the new location of the virtual object. In some implementations, the game can obtain all photographs of the terrain corresponding to the next hole of golf.
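    The pre-fetch idea can be sketched as two pieces: a predictor that ranks likely next cells from playing history (steps 203/205), and a fetch loop that warms a local cache (step 207). Both the frequency-count predictor and the fetch callback are illustrative assumptions.

```python
# Sketch of FIG. 2B: predict likely landing cells from playing history,
# then fetch their photographs into a local cache before they are needed.
# The predictor and the fetch function are stand-ins, not the patent's.

def predict_cells(history, current_cell):
    """Hypothetical predictor: cells most often reached from the current
    cell in past play, most frequent first. history is (prev, next) pairs."""
    counts = {}
    for prev, nxt in history:
        if prev == current_cell:
            counts[nxt] = counts.get(nxt, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)

def prefetch(cache, fetch, cells):
    """Fetch photographs for likely cells ahead of time (e.g., from a server)."""
    for cell in cells:
        if cell not in cache:
            cache[cell] = fetch(cell)
    return cache

history = [("tee1", "fw3"), ("tee1", "fw3"), ("tee1", "bunker2")]
cache = prefetch({}, lambda c: f"photo:{c}", predict_cells(history, "tee1"))
```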
  • [0062]
    Some or all of a virtual object's movement in the virtual course can be animated in a course photograph. For example, after the player strikes the golf ball 108 in photograph 102 a (as shown in FIG. 1A), photograph 102 b (as shown in FIG. 1B) can be presented to the player along with animation of the ball 108 falling from the sky at location 108 a, impacting the golf course at location 108 b, and rolling to resting location 108 c. If the ball 108 were to continue rolling beyond the edge of the photograph 102 b, a new photograph corresponding to the ball 108's new location could be displayed. This can continue until the ball 108 comes to rest. Alternatively, only the last such photograph (i.e., the photograph of the virtual object's resting place) need be presented. Visual representations of other virtual objects can also be animated in the photograph 102 a. For example, the avatar 104 can be animated such that the avatar 104 swings the golf club 112 and reacts to the swing. As another example, a golf flag 106 can be animated such that the golf flag 106 moves in the wind.
  • [0063]
    Additional graphical information to help the player can be incorporated into the photograph and the GUI 100. As shown in FIG. 1C, a directional aiming arrow 120 is provided to assist the player in setting up a shot. An animated arc 122 can be drawn on the photograph to show the player the path the golf ball 108 will take in the air and on the course. Alternatively, the arc 122 can be drawn as the golf ball 108 moves in the photograph 102 c. Two status areas 122 a-b are incorporated into the GUI 100 to provide information such as the current location in the virtual course, the player score, distance to the hole, wind speed and direction, and the virtual club the player is using.
  • [0064]
    In order to systematically photograph an actual course (e.g., a track, a golf course, a baseball diamond, a football field, a tennis court, one or more roadways) for use in an electronic game or other application, the course can be manually or automatically divided into a grid of cells. Each cell defines a physical area of the course that will be photographed for use in the simulation. Each cell can have one or more photographs associated with the cell. In various implementations, a cell photograph captures the area of the course corresponding to the area of the cell. FIG. 3A illustrates an example course grid 300. A course can be of any size and shape, and can include non-adjacent areas. Likewise, cells can have different sizes and shapes and do not have to be adjacent to one another. Cell density can vary depending on the portion of the course the cells cover. In various implementations, cell density increases in course areas where players are more likely to interact with virtual objects.
  • [0065]
    In the world of golf, for example, these areas would be putting greens (e.g., 302), tee boxes (e.g., 306 a-d) and hazards such as sand traps (e.g., 304 a-d) and obstructions such as trees that golfers must circumnavigate. In other areas of the course, cell density is decreased, meaning fewer course photographs need to be taken. In various implementations, lower density cell areas have a lower frequency of balls landing in them, require a wider area of visibility for the player, or both. In various implementations, automatic image recognition techniques can be used to identify such areas of a course based on recognizing certain visible features (e.g., putting greens, sand traps, trees). By identifying areas of a course as having a high or low probability of player interaction, a course can be automatically divided into regions having appropriate cell densities.
  • [0066]
    In various implementations, a course can have more than one layer of cells. The need for this may arise, for instance, to handle situations when a player, due to accident or poor skills, causes a virtual object to be located in a part of the course that rarely sees play. In FIG. 3A, small cell 308 b for tee box 306 a is the cell used by default for a photograph at this stage of the course since most players are able to hit the ball quite a distance up the fairway. However, some players may cause the ball to land in close proximity to the tee box 306 a. The area just outside of the tee box 306 a is not included in the photograph for cell 308 b. However, secondary cell 308 a overlaying the default cell 308 b can be used to obtain a photograph when the ball lies within the bounds of cell 308 a. The photograph for the secondary cell 308 a encompasses the tee box 306 a and the surrounding area. A layer can be chosen based on rules or heuristics that can depend on the state of the virtual universe at a particular point in time. In various implementations, a layer is chosen based on which provides the smallest cell size for the location of a virtual object. In other implementations, a layer can be chosen based on a required style of presentation. For example, it may be desirable to show a ball in flight passing through a cell for dramatic effect.
  • [0067]
    As discussed above, in various implementations each cell in a course grid is photographed such that the photograph encompasses the area of the course in the cell. For example, the photograph shown in FIG. 3C is of a 25′6″×25′6″ cell indicated by the boundary 301. Two avatars (104 a-b) have been rendered in the photograph to illustrate how the scale of virtual objects changes based on their position on a course terrain. This is described in more detail below. The photograph is taken by a camera at a specified 3D position (longitude, latitude, and altitude) in the actual course. A camera's 3D position can be determined by use of a Global Positioning System (GPS), radio triangulation, or a ground based navigation system, for example. Since the position of each cell is known, the camera position can be specified as a setback distance from the cell and a height above the cell. In the photograph of FIG. 3C, the camera is positioned 29′6″ away from the cell and at a height of 10′3″ above the cell. A 24 mm lens was used for the photograph. FIG. 3D is a photograph of a 10′3″×10′3″ cell where the camera was positioned at a setback of 12′6″ and a height of 5′6″, and using an 18 mm lens.
  • [0068]
    FIG. 3B illustrates an example of how photograph parameters can be derived from a cell in a course grid 300. In various implementations, a camera's position and direction can be determined based on a target position for a given cell. In golf, for example, generally the target will be the hole unless the fairway turns such that players must aim for the turn in order to set up a shot for the hole. In this latter case, the target would be the turning point in the fairway. The target for cells 310 a and 310 b is hole 302. A line passes through the center of each cell to the target. It is along this line that the camera lens will point towards the target. The location of the camera on the course will be along the line and outside of the cell. Cell 310 a's camera is located at position 312 a along the line defined by endpoints 312 a and 302. Likewise, cell 310 b's camera is located at position 312 b along the line defined by endpoints 312 b and 302.
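    The geometry just described can be sketched directly: the camera sits on the line through the cell center and the target, behind the cell, at a fixed setback and height. For simplicity this sketch measures the setback from the cell center (the text measures it from the cell), so the arithmetic here is an illustrative assumption.

```python
# Geometric sketch of FIG. 3B: place the camera on the center-to-target
# line, 'setback' feet behind the cell center, 'height' feet up. The lens
# then points along this line toward the target.

import math

def camera_position(cell_center, target, setback, height):
    """Return (x, y, z) of the camera for one cell, in course coordinates."""
    cx, cy = cell_center
    tx, ty = target
    dx, dy = tx - cx, ty - cy
    d = math.hypot(dx, dy)
    ux, uy = dx / d, dy / d            # unit vector toward the target
    return (cx - setback * ux, cy - setback * uy, height)

# 25'6" cell from FIG. 3C: setback 29'6" (29.5 ft), height 10'3" (10.25 ft)
pos = camera_position((0.0, 0.0), (100.0, 0.0), 29.5, 10.25)
# the camera ends up 29.5 ft behind the cell center, opposite the target
```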
  • [0069]
    In various implementations, the focal length of the lens, the angle of the lens, the offset of the camera from the edge of the cell, and the height of the camera can be predefined for a given cell size. In another implementation, one or more of the focal length, the angle of the lens, and the 3D location of the camera can be dynamically determined. By way of illustration, such a determination can take into account the physical terrain that corresponds to the cell. If a given cell was in a valley, for instance, it could be beneficial to provide more of an overhead shot so that a player does not lose perspective with the surrounding course area.
  • [0070]
    FIG. 4 is a flowchart illustrating an example technique 400 for automatically dividing a course into cells and generating a shot list. Since a course can be automatically divided into cells and since camera parameters for each cell can be automatically determined, a so-called shot list can be automatically determined. A shot list is a list of photographs that need to be taken for cells in a given course. Each shot includes a 3D location of the camera, lens focal length, direction, and angle. A course is initially divided into cells as described above (step 402). One or more target points are determined for the course (e.g., 302; step 404). Camera parameters are determined for each cell based on the target point(s) and/or cell size (step 406). Finally, a shot list is generated describing the camera settings required to photograph each cell on the course (step 408). In a further implementation, the shot list can be downloaded to a robotic device with an attached camera such as a robotic helicopter capable of hovering at precise 3D coordinates. The robotic device can then capture photographs for one or more of the cells.
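    Steps 406 and 408 can be sketched as a table lookup plus a loop: camera parameters are predefined per cell size (as the text says), and one shot record is emitted per cell. The preset table below is a stand-in, loosely based on the FIG. 3C/3D examples; it is not the patent's actual parameter set.

```python
# Sketch of steps 406-408 of FIG. 4: derive camera settings per cell from
# its size and emit a shot list. Presets are illustrative values loosely
# based on the FIG. 3C (25'6" cell) and FIG. 3D (10'3" cell) examples:
# (setback ft, height ft, lens mm).
PRESETS = {25.5: (29.5, 10.25, 24), 10.25: (12.5, 5.5, 18)}

def generate_shot_list(cells, target):
    """cells: list of (center, size) pairs; returns one shot per cell.
    The camera aims along the line from each cell center to the target."""
    shots = []
    for center, size in cells:
        setback, height, lens = PRESETS[size]
        shots.append({
            "cell": center,
            "target": target,
            "setback": setback,
            "height": height,
            "lens_mm": lens,
        })
    return shots

shots = generate_shot_list([((0, 0), 25.5), ((40, 10), 10.25)], (300, 50))
```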
  • [0071]
    FIG. 5A is an illustration of an example of a course terrain 501 for a virtual course. Each cell (e.g., 303) maps to a portion of the course terrain 501. In addition to the topography information that the course terrain 501 provides, surface type information can be integrated into the course terrain 501 to further increase the realism of virtual objects' interaction with the course terrain 501 and objects on the course terrain 501. By way of illustration, a ball that lands in the rough tends to lose momentum more quickly than a ball that lands on the green. A ball that hits the cart path, which is a hard surface, such as concrete, tends to bounce more and roll faster than a ball that hits grass. Even the direction of lie of the grass on the green can affect friction that acts on the ball and therefore changes the speed of the ball. Wet grass can decrease the coefficient of friction and cause the ball to slide more than dry grass, but can also increase the springiness of the grass and increase the roll resistance of the grass. A ball that lands in a sand trap loses momentum and tends to roll or slide little. A ball that lands in a water hazard sinks and its post-land movement is irrelevant to the player.
  • [0072]
    Once a ball is hit with a face of a club, the ball has a velocity, direction, spin rate and spin direction. These are described further herein. Hitting the ball either puts the ball into flight or pushes the ball along the ground. The velocity of the ball can range from a maximum of about 75 m/s, which is a drive by a professional golfer, to about 26 m/s at the end of a drive. A putt is generally around 1.83 m/s, and any ball rolling faster than 1.63 m/s will not be captured by the cup.
  • [0073]
    A rolling model of the ball simulates the behavior of the ball as it rolls across a surface. Rolling begins when the ball approaches the surface from flight, such as within several millimeters of the surface, and the normal component of the ball's velocity is below a particular threshold. When the ball is rolling, the ball is subject to gravity, wind, friction and a normal force from the surface. The ball continues to roll until reaching an equilibrium state, where the velocity is approximately zero and the gravity, wind, friction and normal forces are in balance.
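    The entry condition for the rolling model can be sketched as a simple predicate: near the surface, with a small normal velocity component. The numeric thresholds below are assumptions for illustration; the text only says "several millimeters" and "a particular threshold".

```python
# Illustrative sketch of the rolling model's entry condition: the ball
# transitions from the flight model to the rolling model when it is within
# a few millimeters of the surface and its velocity normal to the surface
# is small. Both thresholds are assumed values.

def starts_rolling(height_above_surface_m, normal_velocity_ms,
                   height_eps=0.005, v_threshold=0.5):
    """True if the ball should switch from flight to rolling."""
    return (height_above_surface_m <= height_eps
            and abs(normal_velocity_ms) < v_threshold)

print(starts_rolling(0.002, 0.1))   # near surface, slow normal speed
print(starts_rolling(0.002, 3.0))   # still bouncing
```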
  • [0074]
    As a golf ball rolls, rolling friction slows down the angular velocity of the ball. The coefficient of rolling friction for a golf ball on a green can be between about 0.054 and 0.196 (based on Stimpmeter ratings). Grass on the fairways is at the high end of this range, and the rough and sand traps are even higher. If the grass is wet, the friction can be greater than for the same type of dry grass.
  • [0075]
    The coefficient of friction describes how much resistive force is generated by sliding a ball along a surface. A golf ball sliding on a green can have a value of between about 0.25 and 0.50, such as about 0.4.
  • [0076]
    In the course of simulating the golf ball's trajectory, the friction force that results from sliding across a surface can be determined. Sliding friction is a contact force that arises when two surfaces in contact with one another have a non-zero relative velocity. The direction of the friction force is opposite the direction of relative motion, while the magnitude of the force is based on physical properties of the two surfaces involved. The Coulomb model provides a reasonable estimate of the maximum magnitude of the friction force, based on the magnitude of the normal force and an experimentally determined coefficient.
  • [0077]
    Calculating the actual direction and magnitude of the friction force can be more complicated, especially when rotational movement is considered. Angular velocity, or spin, can increase or decrease the relative contact velocity. A rolling object, for example, has a contact velocity of zero, and thus experiences no sliding friction. A rolling object does, however, experience a separate force, called rolling friction, which acts to oppose the object's motion. Rolling friction typically arises from energy losses caused by deformation of one or both of the objects involved. Furthermore, sliding friction usually generates a torque that works towards establishing rolling, in effect canceling itself out.
  • [0078]
    An algorithm for computing the average friction force over a fixed duration for a sphere on a flat surface can account for linear and angular velocity, as well as external linear accelerations, such as gravity. Physical properties of the sphere, like radius, mass and moment of inertia are also incorporated into the result.
  • [0079]
    The algorithm could be considered an extension of the Coulomb model. The algorithm begins by determining how much friction force it takes to start—or maintain—rolling over the given duration. It then limits this quantity by the maximum amount estimated by the Coulomb model.
  • [0080]
    Rolling can be defined as follows. Let νcm be the velocity of the center of mass, ω be the angular velocity, and r be the vector from the center of mass to the contact point. The velocity of the contact point can be determined by νcp = νcm + (ω × r). If the sphere is rolling, the velocity of the contact point is zero, which implies
  • ω = −νcm/r.
  • [0081]
    Next, the force required to start the ball rolling over a particular interval is determined. If νcm, νcp and ω are functions of time (indicated by a subscript), and the time interval is defined as ranging from 0 to t, the following equations can be used:

  • νcp,0 = νcm,0 + (ω0 × r)

  • νcp,t = νcm,t + (ωt × r) = 0
  • [0082]
    Let x be the total external tangential force. An example of this would be the component of gravitational force parallel to a sloped surface. This represents any external force that affects the relative contact velocity but does not apply a torque to the sphere.
  • [0083]
    Let m be the mass of the sphere, and I be the moment of inertia. If FR is the amount of force that must be applied over time t to ensure the ball is rolling, the following equations can be used to determine the velocities:
  • νcm,t = νcm,0 + ((FR + x)/m)·t
  • ωt = ω0 + ((r × FR)/I)·t
  • [0084]
    This implies:
  • FR = −(νcm,0 + (ω0 × r) + (x/m)t) / ((1/m + r²/I)t) = −(νcp,0 + (x/m)t) / ((1/m + r²/I)t)
  • [0085]
    The algorithm then proceeds to calculate the maximum friction based on the Coulomb model, using the normal force FN and an externally defined coefficient of friction μ. The direction of the friction force is given by FR/|FR| and the magnitude is given by min(μFN, |FR|).
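The force-then-clamp computation described above can be sketched as follows (Python; the function name and the solid-sphere moment of inertia used in the example are illustrative choices, not from the text):

```python
import numpy as np

def rolling_friction_force(v_cp0, x_ext, m, r, I, t, mu, F_N):
    """Average friction force over duration t for a sphere on a flat surface.

    v_cp0 -- contact-point velocity at the start of the interval
    x_ext -- total external tangential force (e.g. gravity along a slope)
    """
    # Force required to bring the contact point to rest (rolling) by time t
    F_R = -(v_cp0 + (x_ext / m) * t) / ((1.0 / m + r**2 / I) * t)
    # Clamp the magnitude to the Coulomb limit mu * F_N, keeping the direction
    mag = np.linalg.norm(F_R)
    limit = mu * F_N
    if mag > limit:
        F_R = F_R * (limit / mag)
    return F_R
```

The same denominator (1/m + r²/I) reappears in the impulse formula of paragraph [0086], so the helper can be reused there with t removed.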
  • [0086]
    This algorithm can also be used to calculate the frictional impulse JT that occurs during a collision. Similar math yields the following formula:
  • JT = −(νcm,0 + (ω0 × r)) / (1/m + r²/I) = −νcp,0 / (1/m + r²/I)
  • [0087]
    When a ball lands from flight, often the ball bounces, in part due to the elasticity of the ball and the hardness or elasticity of the surface. A scalar value that describes the amount of energy lost when the ball bounces on a surface is the coefficient of restitution. Soft surfaces, such as sand, have lower coefficients of restitution than firmer surfaces, such as greens and cart paths. Soft turf can have the following values for the coefficient of restitution

  • e = 0.510 − 0.0375ν + 0.000903ν² for ν ≤ 20 m s⁻¹

  • e = 0.120 for ν > 20 m s⁻¹
  • where ν is the impact speed normal to the surface. See, e.g., Penner, A. R. “The physics of golf: The optimum loft of a driver,” American Journal of Physics 69 (2001): 563-568.
  • [0088]
    An impact parameter is a scalar quantity, measured in radians, that describes the amount of surface deformation caused by a ball impact. In some implementations, the calculations use a Cartesian coordinate system where the x axis represents the east/west position, the y axis the north/south position, and the z axis the height, or up/down position. Thus, vx is the velocity of the ball in the east/west direction and vy is the velocity of the ball in the north/south direction. A rough approximation for the impact parameter can be estimated from the following equation.
  • θc = 0.269(ν/18.6)(φ/0.775), where φ = tan⁻¹(νx/νy)
  • See, e.g., Penner, A. R. “The run of a golf ball,” Canadian Journal of Physics 80 (2002): 931-940. Softer surfaces, such as sand, have a higher impact parameter than harder surfaces, e.g., cart paths, which experience relatively little deformation and are almost independent of impact speed or impact angle.
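In code, with φ supplied in radians as defined in the text (a sketch; the function name is an illustrative choice):

```python
def impact_parameter(v, phi):
    """Angular impact parameter (radians): scales linearly with impact
    speed v (m/s) and angle phi (radians), per the approximation above."""
    return 0.269 * (v / 18.6) * (phi / 0.775)
```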
  • [0089]
    The virtual ball's flight, rolling, bounce and slide actions can be approximated to estimate the real motion of a ball. The flight can be estimated using the following model, which incorporates the effects of gravity, lift and drag on the ball. Ball flight begins after the ball is struck, such as by a club face, and continues until the ball collides with the ground or an obstacle, such as a tree, golf cart or other object in the landscape. After a collision, the ball can continue in flight if the ball still has upward displacement or velocity. If the ball does not have any upward displacement or velocity, a rolling model is used to determine the ball's movement, instead of the flight model.
  • [0090]
    To determine the ball's flight, the drag force on the ball is calculated. The coefficient of drag, CD, can be determined from equations generated by fitting curves to data collected from live balls (see, e.g., Bearman, P. and Harvey, J. “Golf Ball Aerodynamics,” Aeronautical Quarterly 27 (1976): 112-122.). The velocity is derived from the velocity of the ball after the ball is hit. ρ is the atmospheric density, in kg/m³. The diameter of a golf ball is at least 4.27×10⁻² meters.
  • FD = ½ρ(πr²)CDν²
  • [0091]
    The lift force on the ball is calculated using the following equation. The coefficient of lift, CL, can be determined from equations generated by fitting curves to data collected from live balls (see, e.g., Bearman, supra).
  • FL = ½ρ(πr²)CLν²
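Both magnitudes share the same ½ρ(πr²)ν² factor, so a single helper can return the pair (Python sketch; the values in the usage are illustrative):

```python
import math

def drag_lift(rho, r, v, C_D, C_L):
    """Return (F_D, F_L): drag and lift force magnitudes on a ball of
    radius r moving at fluid speed v through air of density rho."""
    q = 0.5 * rho * (math.pi * r**2) * v**2   # common aerodynamic factor
    return C_D * q, C_L * q
```

Because both forces scale with ν², doubling the fluid speed quadruples each magnitude.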
  • [0092]
    Optionally, atmospheric conditions, such as wind and air density, are used to modify the ball's flight path. If atmospheric conditions are accounted for, the wind velocity is determined. The wind can be represented as a function of time and position, which returns a vector quantity indicating the direction and speed of the wind. At least three different wind models can be used. A basic wind model varies the wind direction and speed over time, but assumes that the wind is the same everywhere on the course. Because wind speed usually decreases close to the surface of the ground, the wind model can be scaled linearly to 0, which may require using the 3D terrain data for the course. Further, because wind can be shaped by local geographic features, such as hills or valleys, the wind speed and direction can be altered based on the local geographic features. For example, a hill can create a wind shadow. A wind vector can be stored for each point on a hole. A vector field can be implemented by placing an image map over the course terrain for the hole and using the three channels of the image map to represent the components of the wind vector along each axis. The vectors can represent absolute wind vectors or a relative offset from a global wind vector. Each vector field can be tied to a prevailing wind direction. The ball's fluid velocity can be calculated by subtracting the ball velocity and adding the wind velocity. Headwinds increase and tailwinds decrease the apparent fluid velocity.
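The fluid-velocity rule and a simple near-ground taper from the basic wind model might be sketched as follows (Python; the 10 m full-strength reference height is an assumption for illustration, not from the text):

```python
import numpy as np

def wind_at_height(base_wind, h, h_ref=10.0):
    """Scale the wind vector linearly to zero at the ground; assumed
    full strength at h_ref metres (an illustrative choice)."""
    scale = min(max(h / h_ref, 0.0), 1.0)
    return base_wind * scale

def fluid_velocity(ball_v, wind_v):
    """Air velocity relative to the ball: subtract the ball velocity
    and add the wind velocity."""
    return wind_v - ball_v
```

A headwind (wind opposing ball motion) increases |fluid_velocity|, a tailwind decreases it, matching the text.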
  • [0093]
    The direction of the lift force is determined by the vector product of the fluid velocity and the ball's axis of rotation.
  • [0094]
    The ball's gravitational force is calculated, using mass times the gravitational acceleration constant of 9.8 m/s2. A golf ball's maximum mass is 45.93 grams, according to USGA rules. The ball's mass is also used to calculate the ball's linear acceleration, where the sum of forces is divided by the ball's mass.
  • [0095]
    In addition to lift and drag, the spinning golf ball is subject to friction with the surrounding atmosphere. This friction applies a torque, which decreases the ball's rate of spin. The flight model uses a coefficient of moment (Cm) to calculate the magnitude of the frictional torque (τ), using the following equation:
  • τ = −ρ(πr²)Cmν²(ω/|ω|)(2r)
  • [0096]
    The coefficient of moment is calculated as a linear function of the spin ratio, which is defined as the ratio of peripheral speed to the fluid speed. This function has a typical constant around 0.009.
  • [0097]
    The resulting spin deceleration is given by:
  • α = τ/I
  • where I is the moment of inertia.
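Putting the torque and spin deceleration together (Python sketch; Cm is modelled as the ~0.009 constant from the text times the spin ratio, and the function name is an illustrative choice):

```python
import numpy as np

def spin_decay(rho, r, v, omega, I, k=0.009):
    """Angular deceleration vector from aerodynamic friction on a
    spinning ball; opposes the spin axis omega."""
    w = np.linalg.norm(omega)
    if w == 0.0 or v == 0.0:
        return np.zeros_like(omega)
    C_m = k * (w * r / v)                    # linear in spin ratio
    tau = -rho * (np.pi * r**2) * C_m * v**2 * (omega / w) * (2 * r)
    return tau / I                            # alpha = tau / I
```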
  • [0098]
    The position of the ball over time, or the trajectory, is determined based on a position, velocity and acceleration of the ball. The movement of the ball can be calculated for each time step, where the time step is between about 0.001 and 0.01 seconds. However, other time steps can be used as required to minimize artifacts and so long as the time steps are not so small as to make the computations overly expensive.
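A minimal fixed-step integrator consistent with this description (Python; semi-implicit Euler is an implementation choice here, not specified by the text):

```python
import numpy as np

def step(pos, vel, acc, dt=0.005):
    """Advance position and velocity by one time step; dt falls in the
    0.001-0.01 s range suggested above."""
    vel = vel + acc * dt   # update velocity first (semi-implicit Euler)
    pos = pos + vel * dt
    return pos, vel
```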
  • [0099]
    If the ball is no longer in flight and begins rolling, the characteristics of the surface are used to determine the friction force on the ball. If the ball is transitioning from flight to rolling and there is a bounce during the transition, a bounce model is used to simulate the interaction of the ball with the surface on which the ball bounces. The bounce model uses the properties of both linear and angular momentum and friction to determine new values for linear and angular velocity of the ball and is described below.
  • [0100]
    The bounce model simulates the interaction of the golf ball with the surface of the course. It uses the properties of conservation of momentum (linear and angular) and friction to determine new values for the linear and angular velocity of the ball.



Last edited by Paul on Wed 30 Mar 2022, 11:42 pm; edited 1 time in total



Electronic game utilizing photographs PART 2

Post by Paul Sun 27 Mar 2022, 10:50 pm

  • [0101]
    The bounce model, and particularly the concept of an impact parameter, is based on the model described in Penner, A. R. “The run of a golf ball,” Canadian Journal of Physics 80 (2002): 931-940. The model is extended to three dimensions and modified to support an optional shear parameter for the surface.
  • [0102]
    The bounce model is parameterized by the surface description and surface normal at the point of contact, as well as the physical properties of the ball.
  • [0103]
    The bounce model begins by calculating the amount of surface deformation caused by the ball's impact. The degree of deformation is estimated by an angular impact parameter, which is based on the impact speed and angle of the ball. The bounce model uses the impact parameter to determine the impact normal Ni, which is the effective surface normal after deformation. The impact normal is calculated by rotating the surface normal towards the inverse of the impact velocity direction. To match physical intuition and to prevent artifacts, the impact normal should not rotate beyond the inverse of the impact velocity direction.
  • [0104]
    In some embodiments, the impact parameter uses a simple linear approximation based on the impact speed, but more complicated equations could be used to represent different surface types. In particular, a quadratic equation of impact speed may represent surface deformation more accurately, since the amount of surface deformation is likely proportional to the kinetic energy of the ball. However, the simple linear approximation can be sufficient to represent a realistic action taken by the ball.
  • [0105]
    Using the impact normal, the bounce model calculates the normal and tangential components of the impact velocity. The normal component of impact velocity is used as a parameter in the calculation of the coefficient of restitution for the surface (e). The coefficient of restitution is used to calculate the normal impulse: JN = (1+e)mνi,n. The contact point is also computed (r = −rbNi), where rb is the radius of the ball.
  • [0106]
    The bounce model provides two separate mechanisms for calculating the tangent impulse. If the surface defines a shear parameter s, the tangent impulse is calculated as JT = −smνi,t, where νi,t is the tangential component of the impact velocity. The shear parameter is used to simulate soft, deformable surfaces like sand and water. Otherwise, the tangent impulse is calculated using the algorithm described above with respect to sliding friction.
  • [0107]
    The rebound velocity (νr) is calculated using the equation mνr=mνi+JN+JT. The rebound spin (ωr) is calculated using the equation
  • Iωr = Iωi + (r × JT)
  • [0108]
    Upon exiting from the bounce model, the simulation can enter either the rolling or flying state. The next state is chosen based on the predicted maximum height of the next bounce, which is given by the following formula:
  • h = (νr · n)²/(2g)
  • where h is the predicted height, νr is the rebound velocity, n is the surface normal, and g is the gravitational acceleration constant. If the predicted bounce height is above a threshold value, the ball continues flying. Otherwise, the ball begins rolling.
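The flying-versus-rolling decision can be expressed directly (Python; the threshold value h_min is an assumption for illustration, since the text does not give one):

```python
import numpy as np

def next_state(v_r, n, g=9.8, h_min=0.01):
    """Predict the apex of the next bounce from the rebound velocity v_r
    and surface normal n, then choose the next simulation state.
    h_min (metres) is an illustrative threshold."""
    v_n = np.dot(v_r, n)
    h = 0.5 * v_n**2 / g
    return "flying" if h > h_min else "rolling"
```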
  • [0109]
    The rolling model described herein begins by calculating the rolling normal. This is a combination of the surface normal at the point beneath the ball and a sampled normal using the terrain elevations around the ball. The sampled normal is calculated by determining two sample points based on the horizontal velocity of the ball. The elevations of these two points, along with the elevation of the point beneath the ball, define a plane. The slope of this plane provides an estimate of the normal for the larger region and implements a rough low-pass filter on the terrain normal. By using the horizontal speed of the ball to scale the distance of the sample points, the frequency of the low-pass filter can be increased as the ball slows down, implementing a basic adaptive filter.
  • [0110]
    The rolling model next checks whether the ball is below the surface of the terrain. If so, it assumes a previous roll calculation underestimated the slope. The ball is moved above the terrain, any velocity component in the direction of the rolling normal is canceled, and the kinetic energy is decreased by the amount of potential energy gained.
  • [0111]
    The next step of the rolling model is to calculate the forces and torques acting on the ball. The total force can be divided into the following components: gravity, rolling friction, and sliding friction. Gravitational force is directed downward with a magnitude of mg. Rolling friction is directed opposite the sum of ball velocity and the tangential gravitational force, with magnitude equal to μrFn, where μr is the coefficient of rolling friction for the surface and Fn is the normal force.
  • [0112]
    Sliding friction force is calculated as described above, with tangential gravitational force and rolling friction as the external forces. The total torque is determined by taking the cross product of the contact vector and the sliding friction force. Total friction and total torque are passed to the integrator, which calculates the position, velocity and spin at the next time step.
  • [0113]
    In various implementations, golf ball rolling behavior across a sloped surface can be modeled using existing techniques (see, e.g., Penner, A. R. “The run of a golf ball,” Canadian Journal of Physics 80 (2002): 931-940).
  • [0114]
    In addition to the bounce model and roll model, the ball movement during flight and after coming into contact with the cup and pin can be determined.
  • [0115]
    The flight model simulates the effects of gravity, lift and drag on the ball. The flight model begins after the ball is struck by the club and continues until the ball collides with the ground or another obstacle. After a collision, the flight model continues if the ball has a significant upward displacement or velocity; otherwise, it transitions to the rolling model. Note: it may also be necessary to transition back into the flight model from the rolling model. This could happen if the ball rolled off a drop-off, or rolled up a ramp with sufficiently high velocity.
  • [0116]
    The cup model can be used to determine how the ball reacts when the ball reaches the hole. The cup model assumes that the cup is vertically aligned with the world z-axis. It also disregards the effect of any surface tilt of the green around the rim. The cup model assumes the cup has a diameter of 4.25 inches and a depth of 7 inches. The pin, if present, is assumed to have a diameter of 0.75 inches. Optionally, these measurements can be changed. Because the cup model represents a small, but important, portion of the trajectory, the time step for the cup model can be reduced, such as by a factor of ten, to reduce errors in the simulation.
  • [0117]
    The cup model begins by calculating the displacement of the center of the ball relative to the center of the cup, in both Cartesian and cylindrical coordinates. Using the cylindrical coordinate theta, it also computes radial and tangential direction vectors. The radial direction is the direction outward from the center of the cup to the point on the cup's wall or rim closest to the ball. Using these vectors, the cup model determines the radial and tangential components of the ball's velocity. If the ball is above the rim of the cup, that is, if the elevation is greater than zero, the cup model also calculates the position of the point on the rim closest to the ball, the direction from this point to the center of the ball, and the distance from this point to the center of the ball.
  • [0118]
    Based on the ball's position and velocity, the subsequent behavior of the ball is categorized. These categories are implemented as internal states of the cup model. The states are: ball colliding with the bottom of the cup, ball colliding with the pin, ball colliding with the wall of the cup, ball rolling or sliding along the wall of the cup, ball colliding with the rim, ball sliding or rolling along the rim, and ball falling freely. Each of these states is described below.
  • [0119]
    The ball is colliding with the bottom of the cup when the ball's elevation minus the ball radius is less than or equal to the cup depth and the vertical component of the ball's velocity is less than zero. This state invokes the bounce model, using the surface description of the cup and the unit-z vector as the normal.
  • [0120]
    The ball collides with the pin if the pin is present, the ball's radial position minus the ball radius is less than the pin radius, and the radial component of the ball's velocity is less than zero. This state also invokes the bounce model, using the surface description of the pin and the radial direction as the normal.
  • [0121]
    The ball colliding with the wall of the cup state and the ball rolling or sliding along the wall of the cup state occur when the ball is below the rim of the cup, that is, the ball elevation is less than zero, and the ball is contacting the wall of the cup, that is, the ball's radial position plus the ball radius is greater than the cup radius.
  • [0122]
    The ball is colliding with the wall of the cup when the radial component of the ball's velocity is greater than zero. This state invokes the bounce model, using the surface description of the cup and the negative radial direction as the normal.
  • [0123]
    The ball is rolling or sliding along the wall of the cup when the radial velocity of the ball is less than or equal to zero. In this state, the cup model calculates the total force and torque on the ball and passes both to the integrator, which determines the position, velocity and spin at the next time step. The total force has three components: gravitational force, normal force from the wall of the cup, and friction force. The total torque is determined by the friction force alone, as both the gravitational and normal forces are directed through the center of mass of the ball.
  • [0124]
    As described herein, the magnitude of the gravitational force is calculated by multiplying the mass of the ball by the gravitational acceleration constant (9.81 meters per second squared). The direction of the force is straight down. Because the cup is assumed to be vertical with respect to the ground, all of the force is tangential to the wall of the cup.
  • [0125]
    The normal force keeps the ball from penetrating the wall of the cup. The normal force can be calculated by observing that the normal force is also a centripetal force which causes the center of the ball to travel in a circular path having a radius equal to the cup radius minus the ball radius. The magnitude of a centripetal force is computed by dividing the square of the tangential velocity by the radius of the circular path, while the direction is inward toward the center of the circle. The friction force is calculated using the algorithm described above with respect to sliding friction, with the tangential gravitational force used as an external force.
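The centripetal calculation reduces to a one-line formula (Python; the regulation dimensions used in the example are taken from the cup model described above):

```python
def wall_normal_force(m, v_t, cup_radius, ball_radius):
    """Normal force from the cup wall, computed as the centripetal force
    for a circular path of radius cup_radius - ball_radius."""
    path_radius = cup_radius - ball_radius
    return m * v_t**2 / path_radius
```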
  • [0126]
    The ball collides with the rim and the ball is rolling or sliding along the rim when the ball is above the rim of the cup, that is, the ball's elevation is greater than or equal to zero, and the ball is in contact with the rim, that is, the distance from the rim to the center of the ball is less than the ball's radius.
  • [0127]
    The ball collides with the rim when the dot product of the ball's velocity and the rim direction is less than zero. The state invokes the bounce model, using the surface description of the cup and the rim direction as the normal.
  • [0128]
    The ball is rolling or sliding along the rim when the dot product of the ball's velocity and the rim direction is greater than or equal to zero. In this state, the cup model calculates the total force and torque on the ball and passes both to the integrator. The total force is composed of the gravitational force and frictional force. The forces are split into a normal component, that is, a component aligned with the vector from rim to ball center, and a tangential component, which is defined by the cross product of the tangent vector and the vector from the rim to the ball center. The friction force is calculated as described above with respect to sliding friction, with the tangential gravitational force and centrifugal force as external forces.
  • [0129]
    The ball falling freely is the default state, selected when none of the prerequisites for the other states have been met. In this state, the ball is not in contact with the cup or the pin. The total force on the ball is equal to the gravitational force.
  • [0130]
    The cup model ends when the ball escapes or exits from the cup or is permanently trapped. Escape from the cup is detected when the radial displacement of the center of the ball is greater than the radius of the cup. If both the elevation and vertical velocity of the ball are small, the simulation transitions into the rolling state; otherwise, the simulation transitions into the flying state.
  • [0131]
    The ball is considered permanently trapped when it is no longer energetic enough to escape the cup. The vertical potential energy of the ball is given by the product of ball mass, gravitational acceleration constant and elevation. Using this formulation, the potential energy is negative when the ball is below the rim of the cup. The vertical kinetic energy of the ball is given by half the product of ball mass and the square of the ball's vertical velocity. If the sum of vertical kinetic and potential energy is less than zero, the ball is permanently trapped.
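The entrapment test is only a few lines (Python; elevation is measured relative to the rim, negative inside the cup, as in the text):

```python
def permanently_trapped(m, z, v_z, g=9.8):
    """True when the ball's vertical kinetic plus potential energy is
    negative; z is elevation relative to the rim (negative inside)."""
    potential = m * g * z
    kinetic = 0.5 * m * v_z**2
    return kinetic + potential < 0.0
```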
  • [0132]
    The energy test for entrapment relies on the assumption that the cup model can only decrease vertical kinetic energy. For the most part, this is true. The only exception is the potential to convert angular momentum into vertical velocity via contact with the cup wall. This conversion, while possible, is assumed to be negligible. Holmes, B. “Putting: How a golf ball and hole interact,” American Journal of Physics 59 (1991): 129-136 provides a good view of the physics involved when a golf ball rolls into the hole, and Penner, A. R. “The physics of putting,” Canadian Journal of Physics 80 (2002): 83-96 includes a correction for sloped greens. In various implementations, the game engine 725 (described below) implements the above described models as described in both papers.
  • [0133]
    Like the course terrain that virtual objects interact with, additional features, such as surface characteristics of the physical terrain, can be used in the calculation of a virtual object's movement when in contact with the course terrain and when colliding with objects on the course terrain. These features can be used in the equations above to determine the virtual object's direction, speed, spin and acceleration as the virtual object interacts with the model of the physical terrain.
  • [0134]
    Referring to FIGS. 5B1 and 5B2, a photograph can be divided into general surface types to form a surface type map. The surface types can be bounded by lines drawn to delineate the parts of the hole or by using edge detection techniques on the photograph. The surface type map can itself be mapped onto the portion of the course terrain to which it corresponds. In this way, surface type information can be integrated into the course terrain information. Alternatively, surface types can be directly identified on the course terrain itself.
  • [0135]
    In the example surface type map, a golf cart path 504, a sand trap 506, a green 508, a fairway 510, rough 512 and a pin 514 are each provided with a different surface characteristic. As noted, even though the green 508, fairway 510 and rough 512 are each formed of grass, the ball interacts with each type of grass differently. Specifically, each surface type can have a unique restitution, static friction, kinetic friction, rolling friction and a unique impact parameter. When the roll, bounce and slide of the ball are calculated, the coordinates of the ball's location are matched with the surface type assigned to the coordinates. Of course, each part of the hole can be broken up into further subgroups of surface types, as desired.
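A sketch of how per-surface parameters might be keyed to a surface type map (Python; every name and numeric value here is illustrative, not taken from the text):

```python
# Hypothetical per-surface parameters (restitution, rolling and sliding
# friction coefficients); real values would be tuned per course.
SURFACES = {
    "green":     {"restitution": 0.45, "mu_roll": 0.065, "mu_slide": 0.40},
    "fairway":   {"restitution": 0.40, "mu_roll": 0.10,  "mu_slide": 0.40},
    "rough":     {"restitution": 0.30, "mu_roll": 0.20,  "mu_slide": 0.45},
    "sand":      {"restitution": 0.10, "mu_roll": 0.35,  "mu_slide": 0.50},
    "cart_path": {"restitution": 0.70, "mu_roll": 0.03,  "mu_slide": 0.30},
}

def surface_params(surface_map, x, y):
    """Match the ball's grid coordinates to the surface type painted
    there and return that surface's physical parameters."""
    return SURFACES[surface_map[y][x]]
```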
  • [0136]
    In some implementations, a photograph is used as a template to create a surface type map. Alternative implementations allow assigning surface characteristics to the course terrain directly. The photograph has real world surfaces, such as grass, concrete, water, and sand, which are specified in the photograph, such as by drawing a border around each real world object or around groups of real world objects in the photograph (step 560). In some implementations, the real world object delineators are polygons, shapes with curves or other shapes that are drawn over a corresponding surface. Each shape can be filled with a color or pattern, where each color or pattern corresponds with a specified surface characteristic, such as friction and impact parameter values. (Other ways of associating a shape with a surface type are possible.) That is, the real world objects in a given photograph are assigned a surface type (step 562). The surface characteristics are then mapped to corresponding areas of the course terrain so that they can be used in calculating a virtual object's response to interaction with the course terrain.
  • [0137]
    In addition to providing the surface types, real world objects in the photograph can be assigned a collision property that affects how a virtual object reacts when it collides with the real world objects in relation to the course terrain. In some implementations, the collision property is used in two steps of the virtual object trajectory determination process, collision detection and collision response. Whether the virtual object will collide with an object is determined by comparing the trajectory of the ball with any objects in the course terrain that have a collision property assigned to them. If an imminent collision is detected, the ball is moved just prior to the point of collision. In some implementations, the collision response then adjusts the ball's velocity and direction according to the response's parameters and simulation of the ball movement continues.
  • [0138]
    By way of illustration, two example techniques of marking a photographic image with collision information are described. One technique, referred to herein as the camera image method, provides pixel-accurate collision with a photographic image. The camera image collision method can be used with foreground objects that are perpendicular to the camera and require accurate collision. If the ball appears to move through a collidable object, such as a tree, in the camera image, a collision occurs. This technique involves painting objects in the photographic image in unique colors and adding information to an instruction file, such as an Extensible Markup Language (XML) file, that associates the colors with locations and collision responses. The instruction file and the photographic image can be merged, such as into a .png file that enhances the course terrain, and loaded at runtime.
  • [0139]
    Referring to FIGS. 5C1 and 5C2, the real world objects in the photographic image that can be assigned collision properties are identified (step 564). In one photograph, three palm trees 518 in the foreground are good candidates for camera image collision because they are perpendicular to the camera. The trunks 520 of the trees and the fronds 522 are identified as separate objects so that the trunks 520 provide a different collision response from the fronds 522. The trunks 520 can be given a hard surface collision response, which causes bounce, and the fronds 522 can be given a soft surface collision response, which causes deflection and energy loss. In some implementations, the center of the fronds stop the ball and cause the ball to fall along a random vector and the tips of the fronds deflect the ball and dampen its speed. Therefore, the location of the ball's collision with a soft object, like tree fronds, can affect how the object changes the ball's trajectory or speed. The real world objects are assigned the desired collision property as described further below (step 566).
  • [0140]
    Referring to FIG. 5D, in some implementations, the identified objects can be painted into a collision image. Each object can be given a unique color for matching to data in the instruction file. The colors can be shared with all of the photographic images of a hole. Thus, colors are not reused in other collision images for the hole, unless the color is assigned to a different view of the same object. The palm fronds 522 are each given a similar, but different color as are each of the three tree trunks 520. The collision image is saved in a format, such as Graphics Interchange Format (GIF), which stores accurate colors. Other formats are possible, however.
  • [0141]
    After identifying the real world objects in the photograph, entries corresponding to the objects are added to the information file to identify the position of the object in the course terrain and the collision response assigned to the object. By way of illustration, an example entry can take the form of a tuple: (responseId, color, xPos, yPos, zPos). The responseId can tie the object to a collision response type defined in the information file. The color is a color in a collision image that corresponds to the object and is expressed using a hexadecimal RGB value. In some implementations, xPos, yPos and zPos are the coordinates of the real world object in the course terrain as determined by automatic analysis of the photograph or through other means. The z position is the altitude at the x and y position. The xPos, yPos and zPos can be determined by locating the object in a top-down view, for example. The position selected can be at the approximate center of the object. These values are used in combination with the camera information to determine the depth of the object in the camera view. The depth calculated for this position can be used for the entire object.
  • [0142]
    Below is the object definition for the three tree trunks and three sets of fronds in the example information file.
  • yPos=“550.65” zPos=“10.392”/>
    yPos=“573” zPos=“11.9607”/>
    yPos=“589” zPos=“11.9607”/>
    yPos=“550.65” zPos=“10.392”/>
    yPos=“573” zPos=“11.9607”/>
    yPos=“589” zPos=“11.9607”/>

  • [0143]
    In some implementations, a designer determines which objects are assigned a collision property and assigns the collision property. In some implementations, the system automatically determines which objects should have a collision property without designer input. The system can use a learning algorithm to learn the structure of the golf course from other photographs that have already been assigned collision information. A system that uses a similar learning algorithm to determine vertical structures, sky and ground in photographs is fotowoosh™, at http://www.fotowoosh.com/index.html.
  • [0144]
    FIGS. 5E and 5F show the difference between an example collision response for the tree fronds and an example collision response for a tree trunk. A collision with the fronds causes the ball to lose momentum and deflect a slight amount, then fall to the ground, with the trajectory 524 indicating the virtual ball's movement through the image. A trajectory 526 for a virtual ball that strikes the trunk 520 shows the ball bouncing off the tree trunk 520.
  • [0145]
    The camera image collision method is useful for objects that need accurate collision representation to maintain believability. Photographic images are 2-D representations, and like movie props or billboards, they have no additional depth information beyond that calculated from their x, y and z positions. This makes them good choices for objects that are perpendicular to the camera.
  • [0146]
    The collision layer technique uses an aerial view of the course to show objects at specific positions. The technique can include painting the real world objects' locations in a collision layer. Because the top-down view provides the x and y position, the only additional data needed are the object's height and the collision response to assign to it. In some implementations, the height is combined with the course terrain elevation information to create a volumetric object. For example, if a square is painted on the collision layer over a flat area of the course terrain (e.g., a height map) and a color is assigned that indicates a height of three feet, the result is a three foot tall cube sitting on the height map at the painted location. If the object is on a bumpy area of the height map, the object is roughly cubic but its top surface is bumpy, matching the terrain beneath.
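The volumetric object described above can be sketched as a point-in-object test: a 3D point collides with a painted object when its (x, y) pixel is painted and its z lies between the terrain elevation and the terrain plus the color's assigned height. A minimal sketch, assuming grid-based layers and a color-to-height table (all names hypothetical):

```python
def collides(x, y, z, layer, terrain, heights):
    """Collision-layer test: a point (x, y, z) is inside a painted
    object when the layer pixel at (x, y) carries a color with an
    assigned height and z lies between the terrain elevation and
    terrain + height (the bumpy-topped cube described above).

    layer   -- 2D grid of color values (None where nothing is painted)
    terrain -- 2D grid of ground elevations (the height map)
    heights -- maps a layer color to an object height in feet
    """
    color = layer[y][x]
    if color is None or color not in heights:
        return False
    ground = terrain[y][x]
    return ground <= z <= ground + heights[color]

# A 2x2 layer with one painted pixel, flat terrain, three-foot bushes.
layer = [["#00FF00", None], [None, None]]
terrain = [[0.0, 0.0], [0.0, 0.0]]
heights = {"#00FF00": 3.0}
```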
  • [0147]
    FIGS. 5G and 5H show example steps in creating a collision layer. The objects to be added to the collision layer are identified, here bushes 530 and ground cover 532. Objects whose width varies significantly from top to bottom are not good candidates for the collision layer, because the width is calculated from a single top-down view. A bush that is roughly cylindrical is a good candidate, but a tree with a thin trunk and a large bushy top is not. Objects grouped together should also have a uniform height. Collision discrepancies can be more visible on objects with hard collision responses than soft ones.
  • [0148]
    In the photograph, the bushes are roughly three feet tall and the ground cover is roughly 1 foot tall. Because the bushes are roughly the same height and have roughly the same collision response, they are each painted the same color and can be handled with the same object definition. The collision layer objects and the camera image objects do not share the same color palette. The collision layer can be exported as a GIF file and can be added to the layer definitions in the information file, for example using the following definition.
  • url=“courses/SkillChallenge/
    SC_BHGC_H06_C01/BHGC_H06_Collision.gif”/>

  • [0149]
    Once the layer has been created and added to the information file, collision objects can be added for each color in the collision layer. An example collision layer object follows:

  • [0150]
    The responseId and color indicate the same things in the collision layer as the collision image. The height indicates the height of the object above the course terrain. Example bush 530 and ground cover 532 definitions are below.

  • [0151]
    The bushes 530 and ground cover 532 cause the ball to react the same way, because both sets of vegetation deflect a real ball in similar ways. If the response for ground cover 532 is to be different, e.g., if the ball is to stop and the shot declared out of bounds, a new collision response can be created and assigned to the ground cover 532.
  • [0152]
    At least three different types of collision responses can be provided: a hard object collision response, a soft object collision response, and a boundary collision response for collisions with artificial boundaries. The hard object response is for hard objects, such as tree trunks, rock walls and benches. Its parameters can include the ability to set the surface's normal, to vary the normal (for example, when a bumpy surface is to be simulated), and to set the amount of energy lost from the collision. The soft object collision response can be used with leafy portions of trees, bushes and ground cover. Its parameters can include the ability to set a range of deflection angles as well as the amount of energy lost from the collision. The third response can be used to designate an area on the map that terminates the ball's flight and, optionally, returns the ball to an overridden surface type, such as when the ball goes out of bounds and play continues from the closest location in bounds.
  • [0153]
    The hard surface collision response is used to define solid objects. When the ball hits a hard surface, the ball bounces. The attributes of the collision response indicate how the ball bounces. To determine which direction the ball will bounce, the direction the ball is traveling and the normal of the surface with which it will collide are determined. The normal represents the direction the surface is facing and can be calculated in various ways.
  • [0154]
    The camera image collision calculates normals algorithmically based on the camera parameters, so the collision response does not need to include one; if a collision response of this type does include a normal entry, that entry is ignored. Below is a typical hard surface collision response entry with an elastic surface used for a camera image collision.

  • [0155]
    Collision layer objects can have their normal expressed either by specifying the normal directly or by specifying a position on the course which will be used to calculate the normal. A hard surface collision response used to represent a smooth wall which is facing down the x-axis on the course can be expressed as
  • normalX=“1” normalY=“0” normalZ=“0”/>.

  • [0156]
    A position on the course which will be used to calculate the normal can be specified for curved surfaces. The normal is calculated by drawing a ray from the collision impact position to the position specified. Below is a hard surface collision response which uses a normal position:
  • normalXPos=“133” normalYPos=“1100”
    normalZPos=“0”/>

  • [0157]
    Once a normal has been calculated, a noise factor can be applied to simulate a bumpy surface. This is accomplished by providing a rotational range which is used to vary the normal. The range is expressed in degrees and a value is chosen algorithmically within that ± range. Below is a hard surface collision response used to represent a wall which faces down the x-axis but is made of bumpy rocks that distort the normal by up to ±5° horizontally and vertically.
  • normalY=“0” normalZ=“0” normalVar=“5”/>

  • [0158]
    The hard response attributes above are used as follows. The id is the identification of the collisionResponse. The restitution is the amount of velocity reflected by the surface: a value of one indicates no loss of velocity; a value of zero indicates all velocity is lost. The normalX, normalY and normalZ indicate the x, y and z components, respectively, of the surface's collision normal. The normalXPos, normalYPos and normalZPos are the real world x, y, and z positions, respectively, used to calculate the object's normal and can be expressed in feet or another suitable unit. The normal and normal position are not both specified for the same collision response. The normalVar specifies an angular variation used to distort the normal and is expressed in degrees.
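Putting the attributes above together, a hard-surface bounce can be sketched as: derive the normal (directly, or from the normal position as a ray from the impact point), optionally jitter it by normalVar, then reflect the velocity and scale by restitution. The reflection formula v' = v - 2(v·n)n is a standard assumption, not quoted from the text, and all names here are illustrative:

```python
import math
import random

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def surface_normal(response, impact):
    """Normal from a hard-surface response: either given directly
    (normalX/Y/Z) or derived as the ray from the impact point to
    (normalXPos, normalYPos, normalZPos), per [0155]-[0156]."""
    if "normalX" in response:
        n = (response["normalX"], response["normalY"], response["normalZ"])
    else:
        n = tuple(response[k] - p for k, p in
                  zip(("normalXPos", "normalYPos", "normalZPos"), impact))
    return unit(n)

def bounce(velocity, response, impact, rng=random):
    """Reflect the ball's velocity about the (optionally jittered)
    surface normal and scale it by restitution."""
    n = surface_normal(response, impact)
    var = response.get("normalVar", 0.0)
    if var:
        # Crude bumpy-surface noise: rotate the normal horizontally
        # by a random angle within +/- normalVar degrees.
        a = math.radians(rng.uniform(-var, var))
        n = unit((n[0] * math.cos(a) - n[1] * math.sin(a),
                  n[0] * math.sin(a) + n[1] * math.cos(a), n[2]))
    dot = sum(vc * nc for vc, nc in zip(velocity, n))
    r = response["restitution"]
    return tuple(r * (vc - 2 * dot * nc) for vc, nc in zip(velocity, n))

# A smooth wall facing down the x-axis with 70% restitution.
wall = {"restitution": 0.7, "normalX": 1.0, "normalY": 0.0, "normalZ": 0.0}
```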
  • [0159]
    Soft surface collision responses are used to simulate impacts with surfaces which are not hard enough to cause the ball to bounce but can have some effect on the ball's velocity and direction. Below is an example soft surface collision response used to simulate impact with palm tree fronds. The ball is deflected by ±10° on the horizontal axis (heading) and ±5° on the vertical axis (pitch). In addition, the ball's velocity is reduced by 10% ±5%.
  • speedReduction=“10” speedReductionVar=“5”/>

  • [0160]
    The headingVar is a variable rotation range used to modify the ball's horizontal velocity, expressed in degrees. The pitchVar is a variable rotation range used to modify the ball's vertical velocity, expressed in degrees. The speedReduction is a fixed value used to reduce the ball's speed expressed as a percentage. The speedReductionVar is a variable range used to reduce the ball's speed expressed as a percentage.
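A soft-surface response can then be sketched as a random deflection of heading and pitch plus a percentage speed reduction, using the four attributes just described (function and parameter names are hypothetical):

```python
import random

def soft_deflect(speed, heading, pitch, response, rng=random):
    """Apply a soft-surface response: jitter heading by +/-headingVar
    degrees, pitch by +/-pitchVar degrees, and reduce speed by
    speedReduction +/- speedReductionVar percent."""
    heading += rng.uniform(-response["headingVar"], response["headingVar"])
    pitch += rng.uniform(-response["pitchVar"], response["pitchVar"])
    cut = response["speedReduction"] + rng.uniform(
        -response["speedReductionVar"], response["speedReductionVar"])
    return speed * (1 - cut / 100.0), heading, pitch

# Palm fronds, per paragraph [0159]: +/-10 degrees heading,
# +/-5 degrees pitch, 10% +/- 5% speed loss.
fronds = {"headingVar": 10, "pitchVar": 5,
          "speedReduction": 10, "speedReductionVar": 5}
```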



Last edited by Paul on Wed 30 Mar 2022, 11:41 pm; edited 1 time in total



Electronic game utilizing photographs PART 3

Post by Paul Sun 27 Mar 2022, 10:54 pm

  • [0161]

  • The boundary collision response is used to immediately stop the ball and end the trajectory calculation. The final ball position is the point where the ball intersects an object with the boundary collision response. The final resting location (lie) of the ball is read from the boundary collision's surface name attribute. Although a similar effect can be accomplished using the surface map, the boundary collision method has one key distinction: it can affect a ball in flight. The surface map is painted on top of the terrain and has no associated height information beyond the height derived from the height map. Therefore, the only time the ball is affected by the surface map is when it bounces or rolls on the terrain.
  • [0162]
    A boundary collision response, however, can be tied to a layer object or camera image. Both object types sit on top of the terrain and extend upwards, so collision layer objects and camera image objects can interact with a ball while it is in flight. Adding an object to the collision layer and associating a boundary response with it allows the ball to be stopped in flight, or before it hits a real world object.
  • [0163]
    Boundary responses can also be used to help handle balls that fly beyond the range of the surface map. Any ball that bounces or rolls beyond the edge of the surface map is automatically treated as out of bounds. While this is a good default behavior, it may occasionally generate unwanted results. For example, on an ocean course where the ocean extends to the edge of the surface map, a ball that bounces on that edge would return a final lie of water, but a ball that went beyond the edge of the surface map would return out of bounds. This is undesirable because, from a player's perspective, it looks like the ball hit the water, so the player expects the final lie of the ball to be in the water. To solve this, a tall layer object can be created on the edge of the height map and given a boundary collision response with a surface name of “Water.” When the ball impacts the layer object, it stops. Since the ball does not continue off of the surface map, it is not treated as out of bounds. Instead, its final lie is derived from the boundary response: in this case, in the water.
  • [0164]
    Below is an example of a boundary collision that acts as an out of bounds area.


  • The surfaceName is the surface type that is reported as the ball's final resting position.
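A boundary response can be sketched as an early exit from the trajectory walk: the first intersection with a boundary object ends the flight, and that response's surfaceName becomes the lie. In this illustrative sketch the intersection test is reduced to a lookup table, standing in for the real layer/camera-image geometry (all names hypothetical):

```python
def fly(path, boundary_hits):
    """Walk a precomputed trajectory (a list of 3D points) and stop at
    the first point that intersects an object carrying a boundary
    response; the final lie is that response's surfaceName.

    boundary_hits maps a trajectory point to a surfaceName, a stand-in
    for the real intersection test against layer objects."""
    for point in path:
        name = boundary_hits.get(point)
        if name is not None:
            return point, name          # trajectory ends here
    return path[-1], None               # no boundary object touched

# A ball heading off the edge of an ocean course: a tall "Water"
# boundary object sits at the surface map's edge.
path = [(0, 0, 5), (10, 0, 8), (20, 0, 6)]
hits = {(10, 0, 8): "Water"}
```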
  • [0165]
    Another piece of information that can be added to a photographic image is the relative distance of various real world objects in the photograph. The actual distances can be seen in an aerial photograph of the course. However, to add the perception of depth to the game, masking can be applied that indicates which objects are closer to the camera and which are further. Additionally, whether the ball would be visible in the camera's line of sight can be determined.
  • [0166]
    In some implementations, a designer determines which objects are closer to the camera than other objects and adds the information to the photograph or a layer that is added to the photograph by hand. In some implementations, the system determines which objects are in the foreground. The system can use a learning algorithm to learn the structure and layout of the golf course from other photographs that have already been assigned masking information to indicate hierarchical layers of objects. A system that uses a similar learning algorithm to determine vertical structures, sky and ground is fotowoosh™, at http://www.fotowoosh.com/index.html.
  • [0167]
    FIG. 5I is a photograph of an example golf hole with trees. The photograph includes a stand of trees along a ridge on both the right side 503 a and the left side 503 b of the photographic image. In the real world, the ball would not be visible when the ball is at the same height as the trees (along the z axis) and the trees are between the camera and the ball. The ball would also be hidden if the ball were over the ridge. In the virtual world, the trees can be outlined and each outlined area assigned a distance value. Therefore, if the ball is along a vector running from the camera through one of the trees, the ball's visibility can be based on whether the tree is between the ball and the camera or behind the ball.
  • [0168]
    FIG. 5J is an example representation of the trees in the photograph. The representation includes stencils or silhouettes of trees. The trees that are close 542 to the camera overlap the trees that are further 544 from the camera in a two dimensional photograph. In some implementations, the stencils are drawn down to the exact pixel shape of each tree. Bitmap masking can be used, which gives each tree, or other object that is being masked, a single bit depth, which is then given a three dimensional depth property.
  • [0169]
    FIG. 5K is an example representation of a ball 546 between the camera and the trees that are close 542 to the camera. Because the ball 546 is in front of the trees, the ball 546 remains visible. FIG. 5L shows the ball 546 going beyond a tree that is close 542 to the camera, but falling between the tree that is close 542 to the camera and a tree that is further 544 from the camera. Thus, the ball disappears behind the closer tree and reappears in front of the tree that is further away 544 when no longer covered by the closer tree 542. Even though the masking does not actually indicate a depth for each tree, multiple layers of trees can provide the illusion of depth.
  • [0170]
    FIG. 5M illustrates another instance when the ball is not in the image. If the terrain has any features, such as hills, that are between the ball and the camera, the ball disappears from view; any obstruction causes the ball not to be visible. If the trajectory 552 of the ball is such that the ball can be seen during at least part of its flight path, but it lands over a ridge 550 or hill, the ball will not be displayed at its landing spot without first changing the image to one where the ball is visible. For example, if the camera angle is not such that the interior of the hole can be seen, the ball disappears as it falls into the hole.
  • [0171]
    FIG. 5N is a flowchart illustrating how a virtual object can be displayed during play. The photograph that is to be displayed is received (step 570). The receipt of the photograph, such as by a client or other computing system, is described further herein. The photograph is associated with a first discrete shape that is aligned with the real world image in the photographic image. The discrete shape or shapes have distance values assigned to them. The virtual object is displayed moving in or through the photograph (step 572). The virtual object's trajectory overlaps with the discrete shape when the trajectory's horizontal and vertical coordinates are the same as the discrete shape's horizontal and vertical coordinates. If the trajectory overlaps with a discrete shape associated with the photograph, then whether the trajectory has a distance value greater or lesser than the discrete shape's is determined. If the virtual object is along a part of the trajectory that overlaps with the discrete shape and the trajectory has a distance value greater than the distance value of the discrete shape, the virtual object disappears, or is made to look as if the discrete object obscures it, during the overlap (step 574).
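The visibility test in FIG. 5N can be sketched by comparing the ball's distance from the camera against the distance assigned to any stencil its screen position falls inside. This is a simplification (per-object pixel sets instead of per-pixel bitmap masks), with hypothetical names:

```python
def ball_visible(ball_xy, ball_depth, stencils):
    """The ball is hidden when its screen position falls inside a
    stencil whose assigned distance is less than the ball's own
    distance from the camera.

    stencils -- list of (pixel_set, distance) pairs, one per masked
                object, ordered arbitrarily."""
    for pixels, distance in stencils:
        if ball_xy in pixels and distance < ball_depth:
            return False                # an object sits in front of the ball
    return True

# One near tree (distance 30) and one far tree (distance 80),
# overlapping at screen pixel (5, 5).
near = ({(5, 5), (5, 6)}, 30)
far = ({(5, 5), (9, 9)}, 80)
```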
  • [0172]
    Referring to FIG. 5O, any of the attributes that are assigned to the real world objects in the photographic image can be used to determine the virtual object's movement in relation to, and interaction with, the course terrain. A user provides input indicating how the user wants to control a virtual object, such as an avatar or a ball. The signal that indicates the user input is received (step 580). The movement of the virtual object in relation to the course terrain is determined (step 582). The movement can be based on the user input that is received. The movement is further based on whether the virtual object will collide with a real world object. If the virtual object collides with the real world object, the virtual object's path of movement is changed accordingly so that the movement includes a collision response. If the determination is made by a server or computer system different from the computer system or client being used by the user, the movement of the virtual object as determined is transmitted to the remote receiver (step 584).
  • [0173]
    Referring to FIG. 5P, an example method of representing the movement of a virtual object, e.g., a ball, can include showing the interaction of the virtual object with surfaces in photographs. The photograph that is to be presented to the user is received (step 590). A trajectory for the ball moving over and across, or through, the photograph is also received (step 592). The trajectory includes the ball's movement before and after the ball collides with a real world object in the image. If the ball collides with a surface or object, the trajectory includes a change in path that reflects the collision response. The ball is represented moving in the photograph, where the representation is a 2D representation (step 594).
  • [0174]
    FIG. 6A is a flowchart illustrating an example technique for incorporating a visual representation of virtual objects into a photograph. As described above, a game or simulation engine determines the location of a virtual object in the virtual course in relation to the course's terrain. A course terrain area in which the virtual object is located is identified (step 602). Next, the camera that took the photograph for the cell covering the terrain area is simulated (step 604). As shown in FIG. 6B, a virtual camera 603 simulates the exact view 605 of the actual camera based on known parameters of the camera (e.g., the 3D position of the camera, the angle and direction of the lens, and the focal length of the lens). Using a 3D perspective projection, the virtual object(s) (e.g., ball 108) in the 3D virtual course space are projected into the 2D viewing plane 605 of the simulated camera 603 (step 606). A perspective projection ensures that virtual objects that are farther from the virtual camera appear smaller in relation to those that are closer, thus adding to the sense of realism. In various implementations, the projection can compensate for visual distortion in the camera lens. The virtual objects in the 2D projection are then incorporated into the actual photograph of the cell (e.g., 102 b; step 608). This can be repeated for the same photograph to create an animation of the virtual object. Additional virtual objects (e.g., avatars, virtual equipment) can also be dynamically included in the projection even though the positions of these objects may not be used to trigger photographic mapping.
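Step 606's perspective projection can be sketched with a pinhole model: divide the lateral offsets by the depth along the view direction and scale by the focal length, so farther objects come out smaller. This minimal sketch assumes a fixed camera orientation and ignores lens-distortion compensation (names are hypothetical):

```python
def project(point, camera, focal_length):
    """Minimal pinhole perspective projection with the camera looking
    down the +y axis (no rotation): maps a 3D course position to 2D
    photo-plane coordinates. Farther points yield smaller offsets."""
    dx = point[0] - camera[0]
    dy = point[1] - camera[1]       # depth along the view direction
    dz = point[2] - camera[2]
    if dy <= 0:
        return None                 # point is behind the camera
    return (focal_length * dx / dy, focal_length * dz / dy)

cam = (0.0, 0.0, 5.0)
```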
  • [0175]
    The functionality of a system that incorporates virtual objects into photographs can be segmented into logical components that operate on the same computing device or multiple computing devices connected by one or more networks or other suitable communication means such as shared memory, for instance. A computing device can be a personal computer, server computer, portable computer, cellular telephone, smart phone (e.g., Blackberry), digital media player (e.g., Apple iPod) or other device.
  • [0176]
    Various implementations exploit an example client/server architecture for the functional components, as shown in FIG. 7A. In this architecture, a server 704 includes functionality for modeling the movement of virtual objects in a virtual course through simulation or other means, whereas a client 702 includes a GUI (e.g., 100) for obtaining user input, presenting 2D photographs that incorporate visual representations of virtual objects, and enabling user interaction with the photographs. The server 704 utilizes local or remote storage 708 for game assets such as course photographs, course terrain data, game parameters, game state, and other information, and provides a subset of this information to the client 702 as needed. In some implementations, the client 702 can obtain needed information from sources other than the server 704 such as, for instance, content servers or network accessible caches. The client 702 utilizes local or remote storage 706 for caching photographs, course terrain data, and other information received from the server 704.
  • [0177]
    By way of illustration, a user can provide input such as a golf swing to the client 702's GUI, which results in the client 702 sending a signal to the server 704. The communication between the client 702 and the server 704 can be based on a public protocol such as Hypertext Transfer Protocol (HTTP) or a proprietary protocol. In response, the server 704 performs a simulation or other process to determine the path of the virtual ball through the virtual course and returns to the client 702 the path, a set of course photographs (if not already obtained by the client 702) that capture the ball's path, and any other information that may be needed by the client 702. The client 702 then incorporates animation of the ball traveling through the photographs based on the ball's path through the virtual course.
  • [0178]
    FIG. 7B is a diagram of an example architecture where multiple clients share a server. In this architecture, the server 704 is able to service a plurality of clients 702 a-d, assuming the server 704's computing resources can accommodate the added computational load of additional clients. This architecture also requires that the server 704 maintain game state and other resources on a per-client basis. It allows the clients 702 a-d to play in the same virtual course, if desired, and allows for other multiplayer features such as team forming and competitions between players and teams.
  • [0179]
    FIG. 7C is a diagram of an example server farm architecture which extends the architecture of FIG. 7B by allowing for multiple servers. A server farm 714 is a cluster or collection of networked server processes running on multiple computing devices. A server process in the farm 714 can service more than one client. When a client 702 a-c needs to utilize a server, the client's request is routed to a server proxy 710 instead of an individual server. The server proxy 710 determines which server in the farm is least busy, for example, and assigns the client request to that server (e.g., 712). From that point on, the client can communicate directly with the selected server or the proxy can treat each subsequent request from the client as it did the first request. Server farms also allow for dynamic load balancing. For example, if the performance of server 712 deteriorates due to load, for example, the server 712 or the proxy 710 can move any requests currently pending on the server 712 to a less burdened server. This can occur without the client's knowledge. In some implementations, multiple servers in the farm 714 can cooperate to service a single client request by partitioning computing tasks among them.
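The proxy's least-busy routing described above can be sketched in a few lines; busyness here is just a pending-request count, a stand-in for whatever load metric a real farm would track (names hypothetical):

```python
def route(servers):
    """Server-proxy load balancing sketch: send the next client
    request to the least-busy server in the farm.

    servers -- maps a server name to its current pending-request
               count (the busyness metric assumed here)."""
    return min(servers, key=servers.get)

# Three farm servers with differing pending loads.
farm = {"server-a": 12, "server-b": 3, "server-c": 7}
```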
  • [0180]
    FIG. 7D is a schematic diagram of an example client 702. The client 702 includes functionality expressed as software components which may be combined or divided to accommodate different implementations. A game GUI 718 (e.g., 100) can present 2D photographs in which virtual objects are mapped, prompt users for input, and provide users with visual, audio and haptic feedback based on their input, for instance. In various implementations, the GUI is implemented as an Adobe Flash presentation (the Adobe Flash Player is available from Adobe Systems Incorporated of San Jose, Calif.); however, other implementations are possible. An input model component 716 interprets user input from one or more input devices as signals. For example, computer mouse input could be interpreted as a golf club backswing signal, a forward swing signal, or a directional signal for pointing a golf club head towards a target such as a golf hole. Signals from the input model 716 are provided to the GUI 718 which can, in turn, provide visual, audio or haptic feedback, or combinations of these. By way of illustration, as a user provides input to swing the virtual golf club 112 (see FIG. 1), the virtual club 112 is shown swinging, visual meter 145 is dynamically updated to reflect the progress of the swing, and the user hears the sound of a golf club swing.
  • [0181]
    Additionally, the signals can be provided to a server communication component 730 which is responsible for communicating with a server 704. The communication component 730 can accumulate signals over time until a certain state is reached and then, based on the state, send a request to the server 704. For example, once input signals for a complete swing have been recognized by the server communication component 730, a request to the server is generated with information regarding the physical parameters of the swing (e.g., force, direction, club head orientation). In turn, the server 704 sends a response to the client 702 that can include a virtual object's path through the virtual course based on the physical parameters of the swing, 2D photographs required to visually present the path by the GUI 718, course terrain information, course masks, game assets such as sounds and haptic feedback information, and other information. The response can be broken into one or more individual messages. In addition, some information can be requested by the client 702 ahead of time. For example, the client 702 can pre-fetch photographs, course terrain information and course masks for the next hole of golf from the server 704 and store them in a photo cache 706 b, terrain cache 706 c, and course mask cache 706 d, respectively.
  • [0182]
    FIG. 7E is an overhead view of an example virtual course illustrating cells 703 a-m on a virtual terrain and along a virtual object path 709 (shown in red) that lies partly above (i.e., in the air) the terrain and partly on the terrain (711) and passes through the cells (e.g., in a path above, on, or below the terrain). A path is an ordered sequence of 3D positions in the virtual course. The path 709 begins at position 705 (e.g., the tee) and ends at position 707. FIG. 7F is a profile view of the example virtual object path 709 in relation to the course terrain 501. As is shown, a portion 713 of the path 709 lies above the terrain 501 and corresponds to when the virtual object is in the air. Each position is within at least one cell for the virtual course since there can be more than one layer of cells for the virtual course. Adjacent positions can be within the same cell or different cells. The distance between adjacent positions in the virtual course can be dependent on the desired resolution of the virtual object's movement or other factors such as cell density. For example, where cell density is high, adjacent positions can be closer to one another or vice versa. Alternatively, the distance between adjacent positions in the virtual course can be a function of the acceleration or speed of the simulated virtual object's movement in the virtual course. Other ways for determining the distance between positions are possible.
  • [0183]
    The client 702 includes a shot selector component 720 for determining an ordered sequence of photographs (“shot sequence”) that will be presented in the GUI 718 based on photographs of cells that are on or about the path. Cells that are about the path are cells that the virtual object does not pass through but whose associated photographs capture a portion of the virtual object's path through another cell. In various implementations, a shot sequence is created automatically using one or more photographs capturing one or more cells on the path, presented along with a static or animated representation of the virtual golf ball mapped from its 3D virtual course position(s) to corresponding 2D photograph position(s). The shot sequence presents the photographs in order as though cameras were following the ball from the moment the ball is hit, as it flies through the air, and as it rolls to a resting place on the ground. The movement of the ball within a photograph is simulated based on the path and the course terrain.
  • [0184]
    In various implementations, if there is more than one photograph that can be used to show a particular portion of a path (or substantially the same portion of the path), the photograph with the highest priority is selected for the automatically generated shot sequence. Photograph priority is based on one or more factors which are described in TABLE 1. However, other factors are possible.
  • TABLE 1
    PRIORITY FACTOR / DESCRIPTION
    Path location in a photograph: The photograph showing the path closer to the center of the photograph is given higher priority.
    Length of path in a photograph: The photograph which shows the longest length of a path portion is given higher priority.
    Field of view for a photograph: Photographs having large fields of view are favored for situations where the ball would be rolling in a photograph. In yet another alternative, photographs with smaller fields of view are favored for putting greens, for example.
    Landmark in photograph: If there is a course landmark such as a building or a hazard, photographs showing the landmark are given higher priority.
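Selecting among candidate photographs using the TABLE 1 factors can be sketched as a weighted score; the weights below are illustrative assumptions only, since the text does not specify how the factors are combined:

```python
def pick_photo(candidates):
    """Score candidate photographs for one path portion using the
    TABLE 1 factors and keep the highest-priority one.

    Each candidate is a dict with:
      centered -- 0..1, how close the path is to the photo's center
      path_len -- length of the path portion visible, in feet
      fov      -- field of view in degrees
      landmark -- whether a course landmark is visible
    The weights are illustrative, not from the source."""
    def score(p):
        return (2.0 * p["centered"]
                + 0.01 * p["path_len"]
                + 0.005 * p["fov"]
                + (1.0 if p["landmark"] else 0.0))
    return max(candidates, key=score)

ground = {"id": "ground", "centered": 0.9, "path_len": 120,
          "fov": 60, "landmark": False}
aerial = {"id": "aerial", "centered": 0.4, "path_len": 300,
          "fov": 90, "landmark": True}
```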

  • [0185]
    During presentation of the shot sequence, users can override which photographs are being shown and select different photographs instead. By way of illustration, if the currently displayed photograph is a ground-based shot of a portion of the path, a user can select an overhead shot of the same portion of the path (e.g., by selecting an overhead camera icon in the GUI 100). In this way, a user can interactively override and dictate a shot sequence. A user can override the entire shot sequence or a portion of it. In the latter case, the presentation reverts to the automatically created shot sequence once the user is no longer overriding.
  • [0186]
    In various implementations, a shot sequence is created automatically using scripts (e.g., shot scripts 706 a), rules or heuristics to select the shot sequence's photographs based on the virtual object path. Such a shot sequence can be generated automatically based on one or more approaches which are described in TABLE 2. Other approaches for generating shot sequences are possible.

  • TABLE 2
    SHOT SEQUENCE GENERATED AUTOMATICALLY BASED ON / DESCRIPTION
    Virtual object location: For example, if a given portion of the path is in the air (i.e., the ball is in flight), overhead photographs of that portion of the path are favored over ground-based photographs, whereas if a portion of the path is nearing impact with the course terrain, ground-based photographs are favored. If the path terminates at or near a hole, an overhead shot of the hole is selected. If the path comes close to or intercepts a hazard, a photograph with a large field of view is selected, followed by a photograph showing a close up of the ball interacting with the hazard's sand or water.
    Prior user behavior: As a given user interacts with a shot sequence presentation by overriding which photographs are shown for different portions of a path, the client 702 can learn the user's preferences and, based on these, determine a shot sequence that will satisfy the user.
    Prior group behavior: As a group of users interact with a shot sequence presentation by overriding which photographs are shown for different portions of a path, the client 702 can learn the group's preferences and, based on these, determine a shot sequence that will satisfy a user who is a member of the group.
    Script: Based on a path's starting position, ending position, intermediate positions, and other factors, a script 706 a can dictate which photographs are selected for the shot sequence. For instance, the script might dictate that for positions along a fairway only certain pre-selected photographs are used in the shot sequence.

  • [0187]
    FIG. 7G is a flowchart 715 illustrating an example technique for shot selection. This technique can be performed by the client 702 or by the server 704, for instance. A three-dimensional path through a virtual course is determined by a simulation or other means (step 717). The virtual course includes a model of a physical terrain for a physical course. The terrain model is used to determine how a virtual object interacts with a virtual course. A determination is made as to which areas of the physical course are on the path (step 719). A sequence of photographs is then automatically selected, as described above, which have a view of the course areas on the path (step 721).
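The three steps of flowchart 715 can be sketched in code. Everything below is illustrative: the photograph records, the 10-meter area grid, and the scoring rule are assumptions standing in for the patent's actual data structures.

```python
# Illustrative sketch of flowchart 715 (steps 717-721): given a simulated
# 3D ball path, find which course areas it crosses and pick photographs.
# Photo records, grid size, and scoring are assumed, not from the patent.

def areas_on_path(path, area_size=10.0):
    """Step 719: map 3D path positions onto a grid of course areas."""
    return {(int(x // area_size), int(y // area_size)) for x, y, z in path}

def select_shot_sequence(path, photos, area_size=10.0):
    """Step 721: choose the best-scoring photograph covering each position."""
    sequence = []
    for x, y, z in path:
        cell = (int(x // area_size), int(y // area_size))
        candidates = [p for p in photos if cell in p["areas"]]
        if not candidates:
            continue
        # Favor overhead shots while the ball is well above the ground,
        # ground-based shots near impact (TABLE 2's first heuristic).
        best = max(candidates,
                   key=lambda p: (p["overhead"] == (z > 1.0), p["quality"]))
        if not sequence or sequence[-1] is not best:
            sequence.append(best)
    return sequence

photos = [
    {"id": "overhead-1", "overhead": True,  "quality": 0.8,
     "areas": {(0, 0), (1, 0)}},
    {"id": "ground-1",   "overhead": False, "quality": 0.9,
     "areas": {(1, 0), (2, 0)}},
]
path = [(5.0, 5.0, 0.0), (15.0, 5.0, 20.0), (25.0, 5.0, 0.1)]
seq = select_shot_sequence(path, photos)
```

Here the tee shot starts in area (0, 0), flies over (1, 0), and lands in (2, 0), so the sequence cuts from the overhead photograph to the ground-based one near impact.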
  • [0188]
    With reference again to FIG. 7D, a photo mapper component 722 maps virtual objects in the 3D virtual course to 2D photographs in a shot sequence, as described above in regards to FIGS. 6A-B. The photo mapper component 722 utilizes a visibility detector component 728 to determine whether a virtual object being mapped to a photograph would be visible to the camera. The visibility detector 728 can determine if the virtual camera 603 is unable to see a virtual object due to the object being hidden by the course terrain 501 (706 c), such as when a golf ball rolls into a valley or flies over the horizon line. A second way the visibility detector 728 determines if a virtual object is hidden is based on course bitmap masks (706 d), as described above. If a virtual object is determined to be hidden, the photo mapper 722 will not show the virtual object in the photograph.
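A minimal sketch of the mask-based visibility test (the second check performed by the visibility detector 728). The list-of-rows mask format and function names are assumptions; the patent only specifies that a bitmap mask marks where terrain hides the ball.

```python
# Course bitmap mask sketch: 1 = ball visible at this photo pixel,
# 0 = hidden behind terrain. Format and names are assumed for illustration.

def is_visible(mask, px, py):
    """True if photo pixel (px, py) is inside the photo and unmasked."""
    if not (0 <= py < len(mask) and 0 <= px < len(mask[0])):
        return False               # off the edge of the photograph
    return mask[py][px] == 1

# A tiny 4x3 mask whose right column is hidden (e.g., behind a ridge).
mask = [
    [1, 1, 1, 0],
    [1, 1, 1, 0],
    [1, 1, 1, 0],
]
```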
  • [0189]
    An animation engine component 726 is responsible for animating movement of virtual objects in 2D photographs, such as animating the swing of the avatar 104 and club 112, or animating the golf ball as it flies in the air, collides with objects, and rolls on the ground. The animation engine 726 determines a series of locations for the golf ball in a photograph based on the ball's path through the virtual course. In various implementations, the locations in the photograph can be determined by interpolating between the path positions and mapping the positions to the photograph's coordinate system (e.g., by utilizing the photo mapper 722). Once the series of positions is determined, the golf ball can be animated by rapidly redrawing the golf ball at each position in the series of positions so that the optical illusion of ball movement is created in the viewer's mind. Other objects can be added to a photograph and animated including the movement of a golf flag in the wind, ripples on water, or movement of water such as a waterfall, for example. By way of further illustration, a simulated flock of birds can be added to a photograph such that the flock's animated flight occurs at random times.
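The interpolation step the animation engine 726 performs between simulated path samples might look like the following sketch (plain linear interpolation; the mapping into photograph coordinates via the photo mapper 722 is omitted):

```python
# Linear interpolation between simulated path samples, so the ball can be
# redrawn at many intermediate positions for smooth animation.

def interpolate_path(path, steps_between=3):
    """Insert `steps_between` evenly spaced points between path samples."""
    out = []
    for (x0, y0, z0), (x1, y1, z1) in zip(path, path[1:]):
        for i in range(steps_between + 1):
            t = i / (steps_between + 1)
            out.append((x0 + t * (x1 - x0),
                        y0 + t * (y1 - y0),
                        z0 + t * (z1 - z0)))
    out.append(path[-1])
    return out
```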
  • [0190]
    A special effects component 724 can be used to enhance photographs by performing image processing to alter the lighting in photographs to give the appearance of a particular time of day, such as morning, noon or evening. Other effects are possible, including adding motion blur for virtual objects animated in photographs to enhance the illusion of movement (e.g., the swing of the golf club 112 and the flight of a golf ball 108), shadows, and panning and tilting the virtual camera 603 for effect based on where the ball travels in the photograph to add drama. By way of illustration, the special effects component 724 can tilt the virtual camera up after the ball is struck by the virtual club 112 to emphasize the rise of the ball 108.
  • [0191]
    Sometimes it may be advantageous to combine two or more photographs into a single continuous photograph, such as when the “best” photograph for a virtual object would be a combined photograph, to provide a larger field of view than what is afforded by a single photograph, or to create the illusion that users can freely move through a course. In some implementations, an image stitcher component 727 can combine two or more photographs into a continuous image by aligning the photographs based on identification of common features, stabilizing the photographs so that they only differ in their horizontal component, and finally stitching the images together. The image stitcher 727 can be utilized by the photo mapper 722 or the shot selector 720 to combine photographs.
  • [0192]
    FIG. 7H is a schematic diagram of an example server 704. The server includes a client communication component 723 which is responsible for accepting requests from clients 702 and providing responses that satisfy those requests. By way of illustration, a request from a client 702 for the path of a virtual object in a virtual course can include parameters that characterize the user's swing of a virtual golf club. The corresponding response to this request would be the path of the virtual golf ball in the virtual course and, optionally, a set of photographs 706 b, terrain information 706 c and course bitmap masks 706 d for areas of the physical course that capture the path of the virtual golf ball. Alternatively, some or all of the information relevant to the path can be obtained in separate requests by the client, which allows the client to pre-fetch information to improve responsiveness. A given request or response results in the transmission of one or more messages between a client 702 and the server 704.
  • [0193]
    A state management component 729 maintains the current state of the virtual universe for each user interacting with the server 704 through a client 702. A state includes user input and a set of values representing the condition of a virtual universe before the user input was processed by the game engine 725. The set of values includes, for example, identification of virtual objects in the virtual universe; the current location, speed, acceleration, direction, and other properties of each virtual object in the virtual universe; and information pertaining to the user such as current skill level, history of play, and other suitable information. The state is provided to the game engine 725 as a result of receiving a request from a client 702, for example.
  • [0194]
    The game engine 725 determines a new virtual universe condition by performing a simulation based on user input and a starting virtual universe condition. In various implementations, the game engine 725 models the physics of virtual objects interacting with other virtual objects and with a course terrain in a simulated game of golf and updates the user's virtual universe condition to reflect any changes. The game engine utilizes a collision detector 732 and surface types 706 e for modeling the collision and interaction of virtual objects, as described above.
  • [0195]
    A replay system component 730 allows users to “replay” portions of game play and share them with others. This feature is useful when users want to show others how they made a difficult shot, for instance. A client management component 734 maintains for each user a history of states (provided by the state management component 729) and corresponding identifiers. In various implementations, results transmitted to clients can include an identifier of the state that corresponds to the user input and prior values for the virtual universe that were provided to the game engine 725 to create the results. The identifier can be a sequence of letters, numbers or symbols, for example. In some implementations, the identifier is a uniform resource locator (URL). The identifier can be provided to the server's replay system 730 by a client 702 or other process in order to “replay” a simulation. The replay component 730 uses the identifier to locate the corresponding state and then provides the state to the game engine 725, resulting in a “replay” of the user input for the state. The identifier can also be shared among users through electronic mail, instant messaging, or other means.
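The replay mechanism can be sketched as a store of states keyed by identifier, replayed through a deterministic engine. The toy engine and field names below are invented for illustration; the real game engine 725 is far richer.

```python
# Sketch of the replay system 730: states keyed by identifier; replay looks
# up a prior state and re-runs the (deterministic) game engine on it.

class ReplaySystem:
    def __init__(self, engine):
        self.engine = engine          # deterministic: input + values -> result
        self.history = {}             # identifier -> (user_input, values)

    def record(self, identifier, user_input, values):
        self.history[identifier] = (user_input, dict(values))

    def replay(self, identifier):
        user_input, values = self.history[identifier]
        return self.engine(user_input, values)

def toy_engine(user_input, values):
    # Stand-in for game engine 725: advance the ball by the swing power.
    return {"ball_x": values["ball_x"] + user_input["power"]}

replays = ReplaySystem(toy_engine)
replays.record("shot-1", {"power": 30}, {"ball_x": 100})
```

Because the engine is deterministic, replaying the same identifier always reproduces the same result, which is what lets users share a shot simply by sharing its identifier.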
  • [0196]
    FIG. 7I is a flowchart of an example method 750 for replaying a simulation. A prior state of a virtual universe is selected from a plurality of prior states based on a received identifier by the replay system 730, the prior state including user input previously provided to the electronic game and a set of values representing the condition of the virtual universe before the user input was processed by the game engine 725 (step 752). The current state of the electronic game is set according to the prior state by the replay system 730 (step 754). A new state of the virtual universe is obtained based on processing of the user input by the game engine 725 and the set of values (step 756). Alternatively, the new state is merely obtained from the client management component 734 as the state following the prior state in history of states. A sequence of photographic images based on the new state is selected (step 758).
  • [0197]
    The game engine 725 includes various workings for modeling the physics of virtual golf ball travel (e.g., flight, impact, bounce, roll) in the virtual course. Hereinafter, the virtual golf ball will be referred to as merely the ball. In various implementations, forward Euler integration is used to simulate discrete time steps during simulation of ball movement in the virtual course. At each step, the current dynamic model calculates velocities and accelerations and applies them linearly over the interval of the step size. In further implementations, a fourth-order Runge-Kutta integration method can be used.
  • [0198]
    The time step defines the amount of time that is simulated by each step of the integrator in the game engine 725. The choice of time step balances accuracy with computational complexity: a smaller time step reduces the error introduced by the integration function but increases the number of simulation steps required. If a maximum velocity for the ball is assumed, the choice of time step can be used to limit the distance traveled by the ball during each simulation frame. The time step resolution on the client 702 and the server 704 should be the same so that calculated trajectories of virtual objects are identical.
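A forward Euler integrator of the kind described might look like this. The gravity-plus-linear-drag force model and all constants are stand-ins; the point is the structure of the time stepping, where accelerations are applied linearly over each dt.

```python
# Forward Euler time stepping as described for the game engine 725:
# at each step, compute acceleration and apply it linearly over dt.
# Gravity plus linear drag is an illustrative force model, not the patent's.

def simulate_flight(pos, vel, dt=0.01, drag_k=0.02, g=9.81):
    """Integrate until the ball returns to the ground (z <= 0)."""
    path = [pos]
    x, y, z = pos
    vx, vy, vz = vel
    while True:
        ax, ay, az = -drag_k * vx, -drag_k * vy, -g - drag_k * vz
        vx += ax * dt; vy += ay * dt; vz += az * dt
        x += vx * dt; y += vy * dt; z += vz * dt
        path.append((x, y, z))
        if z <= 0.0:
            return path
```

Halving dt roughly halves the per-step integration error at the cost of twice as many steps, which is exactly the accuracy/cost trade-off described above.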
  • [0199]
    The ball model has a radius and a mass. The United States Golf Association (USGA) rules specify the minimum diameter of the ball as 1.68 inches (0.0427 meters). A British ball is slightly smaller, with a diameter of 1.62 inches (0.041 meters). These correspond to radii of 0.02135 meters and 0.02055 meters, respectively. The USGA rules specify the maximum weight of the ball as 1.62 oz (0.04593 kg). The ball also has a moment of inertia which is a scalar quantity, measured in kg m2, which describes the ball's inertia with respect to rotational motion about its central axis. If the ball is modeled as a solid sphere of uniform density, the moment of inertia is given by the following equation:
  • I = (2/5)·M·R^2 = 8.3743 × 10^-6 kg·m^2
  • [0200]
    The actual moment of inertia varies, generally depending on how the ball was constructed and how it is designed to behave. The coefficient of restitution is a dimensionless constant that describes the amount of momentum lost when the golf ball collides with a solid surface due to deformation, heat, sound, etc., and can be represented as a function of the impact speed. The following equation gives the coefficient of restitution of a golf ball colliding with a club face: e = 0.86 − 0.0029·vi, where vi is the impact speed.
  • [0201]
    The coefficient of lift is a dimensionless constant that describes the amount of lift force generated by a golf ball. It is used by the flight model. It is parameterized by the velocity of the ball through the air and the spin rate of the ball. The coefficient of drag is a dimensionless constant that describes the amount of drag force generated by a golf ball. See description of coefficient of lift, above, for more details. The coefficient of friction describes how much resistive force is generated by sliding a golf ball along a surface. This value is used by the clubhead impact model and the rolling model.
  • [0202]
    The clubhead model assumes that friction is sufficient to cause the ball to begin rolling before leaving the clubhead. The coefficient of friction is estimated at 0.40, although this can vary. Ball position is a vector quantity, measured in meters. Ball velocity is a vector quantity, measured in meters per second. Velocity ranges from a maximum of about 75 m/s for a drive by a professional golfer to about 26 m/s at the end of a drive to 1.63 m/s for the maximum speed that can be captured by the hole when aimed directly at the center. The angular velocity of the ball is a vector quantity, where the direction defines the axis of rotation and the magnitude defines the rotational speed, in radians per second.
  • [0203]
    The position, velocity, and angular velocity of the ball are stored in the inertial reference frame (i.e. relative to the course terrain), though dynamic models may shift it into other frames of reference to simplify certain calculations.
  • [0204]
    There are two general types of golf balls: two-piece and three-piece (or wound) balls. Two-piece balls are made from a solid core with a durable synthetic cover. They are less expensive and more durable than three-piece balls. Because of the harder cover, they tend to travel farther and spin less than three-piece balls. Three-piece balls are made from a solid or liquid core, surrounded by a rubber winding and wrapped in a softer “balata” cover. The softer cover is susceptible to nicks and cuts, which makes the balls wear faster. Three-piece balls don't travel as far as two-piece balls, but the soft cover allows them to achieve higher spin rates at launch and hold the green better upon landing. Two-piece balls have a higher moment of inertia, lower coefficient of friction, and higher coefficient of restitution. Three-piece balls have a lower moment of inertia, higher coefficient of friction, and lower coefficient of restitution.
  • [0205]
    A club model includes a clubhead mass which is a scalar quantity, measured in kg. Clubhead mass can also be estimated from the swing weight of the club. Loft is a scalar quantity that describes the angle the clubface forms with the vertical, measured in radians. A club with low loft has a nearly perpendicular face, like a driver or a putter. Irons and wedges have very high lofts, which generates a higher trajectory with more backspin.
  • [0206]
    The coefficient of restitution describes the amount of momentum lost during the clubhead's impact with the ball. The clubhead's coefficient of restitution has a minor effect compared to the ball's coefficient. Some clubs incorporate a feature known as “spring-like effect”, where the club face is designed to deform and return energy to the ball upon launch. Spring-like effect is modeled as a constant positive percentage modifier to the ball's coefficient of restitution.
  • [0207]
    Shaft length is a scalar quantity describing the distance of the clubhead from the grip, measured in meters. This value is used by the swing model to determine clubhead speed. A longer shaft generally increases clubhead speed at the expense of accuracy.
  • [0208]
    An atmosphere model uses data found in a typical weather report to calculate the atmospheric density, which is used in the flight model to calculate drag and lift. It also models the presence of wind. Pressure is a scalar quantity, measured in millibars (mbar). Temperature is a scalar quantity, measured in degrees Celsius (C).
  • [0209]
    Humidity describes the quantity of water vapor present in atmosphere. It can be specified as either relative humidity or dew point. Relative humidity describes the amount of water vapor present relative to the total amount the air can hold at the current temperature (the saturation pressure). Dew point describes the temperature at which the current amount of water vapor would completely saturate the air. Dew point has the advantage of remaining constant despite shifts in the ambient temperature.
  • [0210]
    Density expresses the amount of mass per unit volume, measured in kg/m3. The density is calculated from the input values for pressure, temperature and humidity using the following equation:
  • D = Pd / (Rd · TK) + Pv / (Rv · TK)
  • where D = density (kg/m3)
  • [0211]
    Pd = pressure of dry air (Pascals)
  • [0212]
    Pv = pressure of water vapor (Pascals)
  • [0213]
    Rd = gas constant for dry air = 287.05 J/(kg · deg K)
  • [0214]
    Rv = gas constant for water vapor = 461.495 J/(kg · deg K)
  • [0215]
    TK = temperature (deg K) = deg C + 273.15
  • [0216]
    The saturation pressure of water vapor can be calculated for a given atmospheric temperature using the following equation:
  • Es = c0 · 10^(c1 · Tc / (c2 + Tc))
  • where Es = saturation pressure of water vapor (mbar)
  • [0217]
    Tc=temperature (deg C)
  • [0218]
    c0=6.1078
  • [0219]
    c1=7.5
  • [0220]
    c2=237.3




Last edited by Paul on Wed 30 Mar 2022, 11:40 pm; edited 1 time in total


Paul



Please enjoy

_________________

May the SUN always be with you

home of

https://www.valleyofthesuncc.com/ an information and entertainment only website
Paul
Paul
Admin
Admin

Posts : 42010
Join date : 2013-05-06

https://www.valleyofthesuncc.com

Back to top Go down

Electronic game utilizing photographs PART 4

Post by Paul Sun 27 Mar 2022, 10:58 pm

  • [0221]
    The pressure of water vapor, Pv, can be calculated from the dew point by simply substituting the dew point in the equation above. To calculate the pressure using relative humidity, the saturation pressure for the current temperature is calculated and multiplied by the relative humidity percentage. Finally, the pressure of the dry air, Pd, can be calculated by subtracting the pressure of water vapor from the absolute pressure. Substituting the values for Pd and Pv into the first equation yields the atmospheric density. The reference value for atmospheric density is 1.2250 kg/m3, which assumes dry air at a pressure of 1013.25 mbar and temperature of 15 deg C.
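Paragraphs [0209] through [0221] amount to a complete recipe for atmospheric density, which can be transcribed directly (the function names are mine; the constants are from the text):

```python
# Air density from pressure, temperature, and dew point, following the
# equations of paragraphs [0210]-[0221]. Function names are illustrative.

def saturation_pressure(temp_c):
    """Es in mbar, from Es = c0 * 10^(c1*Tc/(c2+Tc))."""
    c0, c1, c2 = 6.1078, 7.5, 237.3
    return c0 * 10.0 ** (c1 * temp_c / (c2 + temp_c))

def air_density(pressure_mbar, temp_c, dew_point_c):
    """Density in kg/m^3. Pv comes from the dew point; Pd = P - Pv."""
    Rd, Rv = 287.05, 461.495          # J/(kg*K), dry air / water vapor
    tk = temp_c + 273.15
    pv = saturation_pressure(dew_point_c) * 100.0   # mbar -> Pascals
    pd = pressure_mbar * 100.0 - pv
    return pd / (Rd * tk) + pv / (Rv * tk)
```

With a very low dew point (effectively dry air) at 1013.25 mbar and 15 deg C, this reproduces the reference density of about 1.2250 kg/m3.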
  • [0222]
    Wind is represented as a function of time and position which returns a vector quantity indicating the direction and speed of the wind in meters per second. Wind direction and speed may vary with time, but it is assumed that the wind is the same everywhere on the course. In the real world, wind speed usually decreases close to the surface of the ground. This model builds on the previous model by defining a height below which the wind vector is scaled linearly to zero. This implies a dependency from the atmospheric model on the height map. Wind is often shaped by local geographic features, like hills or valleys. These features may affect not only the wind speed, but also its direction. To represent the local variations, a wind vector can be stored for each point on the hole. Such a vector field can be implemented by placing an image map over the height map for the hole and using the three channels of the image map to represent the components of the wind vector along each axis.
  • [0223]
    The encoded vectors could represent absolute wind vectors or a relative offset from a global wind vector. Each vector field would be closely tied to a prevailing wind direction. (Consider, for example, the wind shadow cast by a hill.) The underlying wind speed and direction can be driven by a noise function, parameterized by time. The inputs to the noise function should allow course designers to specify a prevailing wind direction and speed and a range around each. This will be implemented using either a random walk with shaped probabilities or a Perlin noise function.
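A random walk with a clamped range, one of the two noise options mentioned, could be sketched like this (the step size, clamping scheme, and seeding are assumptions):

```python
# Random-walk wind generator in the spirit of paragraph [0223]: speed and
# direction wander around a designer-set prevailing value within a range.

import math
import random

def wind_series(prevailing_speed, prevailing_dir, speed_range, dir_range,
                steps, step_scale=0.1, seed=0):
    rng = random.Random(seed)          # seeded for reproducibility
    speed, direction = prevailing_speed, prevailing_dir
    out = []
    for _ in range(steps):
        speed += rng.uniform(-1, 1) * step_scale * speed_range
        direction += rng.uniform(-1, 1) * step_scale * dir_range
        # Clamp to the allowed band around the prevailing wind.
        speed = min(max(speed, prevailing_speed - speed_range),
                    prevailing_speed + speed_range)
        direction = min(max(direction, prevailing_dir - dir_range),
                        prevailing_dir + dir_range)
        out.append((speed, direction))
    return out
```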
  • [0224]
    A course model uses a height map which is a bitmap image with greyscale color values to define a regular grid of elevation samples corresponding to the course terrain or topology. This elevation data will be interpolated using either bilinear or bicubic interpolation.
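Bilinear interpolation over such a height map is straightforward; the sketch below samples a small elevation grid (a row-major grid with unit spacing is assumed):

```python
# Bilinear interpolation over a height-map grid, as paragraph [0224]
# describes for sampling course elevation between grid points.

def bilinear_height(grid, x, y):
    """`grid` is rows of elevation samples at integer coordinates."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(grid[0]) - 1)
    y1 = min(y0 + 1, len(grid) - 1)
    fx, fy = x - x0, y - y0
    top = grid[y0][x0] * (1 - fx) + grid[y0][x1] * fx
    bottom = grid[y1][x0] * (1 - fx) + grid[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

grid = [
    [0.0, 2.0],
    [4.0, 6.0],
]
```

For example, sampling this grid at its center averages the four surrounding elevations, giving 3.0.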
  • [0225]
    The lie describes how far the ball has sunk into the surface of the course. It will be measured in meters or as a percentage of the ball's radius. A deeper lie requires the clubhead to dig deeper into the surface material of the course, which reduces the clubhead speed at impact. Also, a deeper lie raises the point of impact between ball and clubface, which affects spin rate and launch angle. The effect of lie will depend on the particulars of the swing and clubhead impact models, and may require additional work.
  • [0226]
    The swing model describes how the golfer swings the club. Inputs include variables from the GUI (player input), swing type, club parameters, and any game stats for the golfer. The primary output of the swing model is a set of dynamic parameters for the club in the instant that it hits the ball. These include clubhead speed and direction, impact point on the ball and clubhead and the dynamic loft of the clubface. These parameters are fed into the clubhead impact model, which generates the initial conditions for the trajectory of the ball.
  • [0227]
    In various implementations, swing is modeled as a double pendulum composed of the golfer's arms and club. Forces, torques and couples are applied to the double pendulum to generate the final motion of the clubhead at impact. While the double pendulum model offers interesting insights into how to improve a golfer's swing, it's not the best model for a game. The connection between input variables and output variables is not intuitive at all.
  • [0228]
    In other implementations, a results-based model is used that allows the parameters to be set directly. The golfer will have a maximum power, which represents either the maximum clubhead speed (for maximum clarity) or the amount of work the golfer is able to do with a club (e.g., to adjust for clubhead weights and shaft lengths).
  • [0229]
    The purpose of the swing model is to compute the initial parameters of a golf ball's trajectory after being struck with a club. The model has two main phases. The first phase determines the position, velocity, and orientation of the clubhead at impact based on player inputs, as well as equipment and environmental parameters. This phase is further subdivided into three separate models to represent the physical swing motion, the presence of golfer error, and interactions of the club with the ground.
  • [0230]
    After the first phase, the state of the clubhead is completely described and the second phase begins. Here, the impact between the clubhead and ball is modeled as a rigid-body collision. From the collision model, the linear and angular velocity of the golf ball can be determined.
  • [0231]
    The trajectory of the golf ball is completely determined by two vector quantities: linear velocity and angular velocity. The linear velocity describes the motion of the ball's center of mass, while the angular velocity describes the rotational motion. (The direction of the angular velocity vector gives the axis of rotation and the magnitude gives the speed of rotation.) Subsequent behavior of the ball during flight is determined by atmospheric interactions like lift and drag, but the overall trajectory is completely determined by these two initial vectors. Taken together, they can describe any possible draw, fade, hook, slice, etc.
  • [0232]
    TABLE 3 below gives the sign and rough magnitude of both deflection and sidespin for some common ball trajectories. Since a right-handed coordinate system is used, positive angles and rotations are counter-clockwise. Positive horizontal deflection is a pull, while negative is a push. Positive sidespin causes a hook, while negative sidespin causes a slice.
  • TABLE 3
    TRAJECTORY    DEFLECTION         SIDESPIN
    Straight      0                  0
    Fade          Positive (small)   Negative (small)
    Hook          0                  Positive (medium)
    Pull          Positive (medium)  0
    Push-Hook     Negative (medium)  Positive (large)
    Pull-Hook     Positive (large)   Positive (medium)
    Draw          Negative (small)   Positive (small)
    Slice         0                  Negative (medium)
    Push          Negative (medium)  0
    Pull-Slice    Positive (medium)  Negative (large)
    Push-Slice    Negative (large)   Negative (medium)

  • [0233]
    The common golfing terms can be related to the vector velocities by defining an appropriate coordinate frame and using some basic trigonometry. If ν represents linear velocity, ω represents angular velocity, and the target, or aim point, is on the x axis, the following relationships hold:
  • Launch speed = |v| = sqrt(vx^2 + vy^2 + vz^2)
    Launch angle = θ = sin^-1(vz / |v|)
    Horizontal deflection = φ = tan^-1(vy / vx)
    Backspin = −ωy
    Sidespin = ωz
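These relationships transcribe directly into code (atan2 is used in place of tan^-1(vy/vx) so the deflection keeps the correct sign in every quadrant):

```python
# Launch parameters of paragraph [0233], computed from the linear velocity
# v and angular velocity w of the ball at launch. Names are illustrative.

import math

def launch_parameters(v, w):
    vx, vy, vz = v
    wx, wy, wz = w
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    return {
        "speed": speed,
        "launch_angle": math.asin(vz / speed),
        "deflection": math.atan2(vy, vx),   # + is a pull, - is a push
        "backspin": -wy,
        "sidespin": wz,
    }
```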
  • [0234]
    The purpose of the arm model is to use player inputs, equipment parameters and surface parameters to compute the velocity and orientation of the clubhead at impact. The arm model assumes a perfect swing; this assumption is later revised by the outputs from the error model before entering into the collision response model.
  • [0235]
    In physics, the golf swing is typically modeled as a double pendulum. The lower pendulum represents the club, while the upper pendulum represents the golfer's arms. At the end of the swing, immediately before impact, both pendulums are relatively aligned with similar velocities. In various implementations, the double pendulum model is collapsed into a single pendulum model, consisting of the shaft combined with the arms. Using this model, reasonable approximations for the state of the clubhead just prior to hitting the ball can be determined.
  • [0236]
    To further simplify the model, the calculation of swing speed is based on a reference swing with known equipment. By calculating the difference between the current equipment and that used for the reference swing, the difference between the swing speeds can be calculated. This avoids a more complicated model of muscles and joints or torques and couples.
  • [0237]
    The geometry of the arm model uses the concept of the swing plane. This is an imaginary plane defined by the line from the ball to the target and the line from the ball to the golfer's shoulders. On a good swing, the clubhead stays within this plane during its entire arc. The motion of the clubhead near impact can be visualized as following a large circle, tilted to pass through the golfer's shoulders. The radius of this circle is determined by adding the golfer's arm length and shaft length. The arm length can be specified directly, or computed using a formula based on the golfer's height (16.1 times height in inches divided by 72).
  • [0238]
    The tilt of the swing plane depends on the terrain. If the ball is on a flat surface, the tilt is roughly equal to the lie angle of the club. A sidehill lie, however, can increase or decrease this angle. If the ball is higher than the golfer's feet, the swing plane becomes more horizontal. If the ball is lower than the golfer's feet, the swing plane becomes more vertical.
  • [0239]
    The angle of the lie is determined by sampling the golf course elevation at three points, corresponding to the ball and the golfer's left and right feet. Foot position is determined by calculating the offset of the feet from the aim line (cosine of lie angle times sum of arm length and shaft length) and assuming a stance width of two feet. Taking the cross product of the vector from ball to left foot and the vector from the ball to the right foot gives the normal of the triangle, from which can be computed both uphill and sidehill lie angles. As noted above, for a sidehill lie, the golfer adjusts by tilting the club up or down to match the difference in elevation. For an uphill or downhill lie, however, it can be assumed that the golfer attempts to keep his body perpendicular to the slope. The swing arc, therefore, is tilted along the aim line to match the slope of the ground.
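The cross-product construction above can be sketched as follows. The sign conventions and the mapping from normal components to uphill/sidehill angles are my assumptions about one reasonable reading; x runs along the aim line and z is up.

```python
# Sketch of the lie computation of paragraph [0239]: sample elevation at the
# ball and both feet, take the cross product of the two foot vectors, and
# read uphill/sidehill angles off the resulting surface normal.

import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def lie_angles(ball, left_foot, right_foot):
    """Points are (x, y, z); x is along the aim line, z is up."""
    v1 = tuple(l - b for l, b in zip(left_foot, ball))
    v2 = tuple(r - b for r, b in zip(right_foot, ball))
    nx, ny, nz = cross(v1, v2)
    if nz < 0:                      # keep the normal pointing up
        nx, ny, nz = -nx, -ny, -nz
    uphill = math.atan2(-nx, nz)    # slope along the aim line
    sidehill = math.atan2(-ny, nz)  # slope across the aim line
    return uphill, sidehill
```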
  • [0240]
    The swing arc model obviously breaks down for extreme lie angles. For example, consider playing a shot with the ball on the lip of a coffin bunker, with the aim line perpendicular to the lip. The lie angle would be computed as an extreme uphill lie, and the assumption that the golfer's body remains perpendicular to the slope would require him to lean more than forty-five degrees to the right. This is clearly unrealistic.



Last edited by Paul on Wed 30 Mar 2022, 11:39 pm; edited 1 time in total


Electronic game utilizing photographs PART 5

Post by Paul Sun 27 Mar 2022, 11:00 pm

  • [0241]
    The forward and back position of the ball in the stance determines the point in the swing arc where the clubhead makes contact. In various implementations, ball placement is defined relative to the low point in the swing arc, which moves depending on the type of swing. Placing the ball behind the low point causes the clubhead to strike it while the clubhead is still descending, while placing the ball ahead of the low point causes the clubhead to strike it when the clubhead is ascending.
  • [0242]
    In various implementations, ball placement, measured in units of distance, is converted into an angular measurement, using the radius of the swing arc. In the discussion below, this angle is called theta. Theta is positive when the ball is moved forward and negative when the ball is moved backward, consistent with our right-handed coordinate system.
  • [0243]
    The velocity of the clubhead at impact is based on its speed and direction. As mentioned above, clubhead speed is computed based on the reference swing. Direction is determined by the tangent of the swing arc at the point where the clubhead contacts the ball.
  • [0244]
    The reference speed provided for the golfer is his swing speed with a standard driver. This assumes a shaft length of 44 inches and a clubhead mass of roughly seven ounces. From the swing speed and radius, the angular velocity can be calculated in radians per second. In various implementations, it is assumed that this angular velocity is constant for all shaft lengths and has an inverse linear relationship with clubhead mass. (i.e., the same golfer swings a heavier clubhead more slowly than a lighter one.) Multiplying the angular velocity by both the shaft length and the ratio of current clubhead mass to reference mass gives us the clubhead speed at impact.
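The reference-swing scaling might be implemented as below. This sketch follows the stated inverse linear relationship with clubhead mass (a heavier head swings more slowly); the function name and the metric conversions of the 44-inch, roughly seven-ounce reference driver are illustrative.

```python
# Clubhead speed from a reference driver swing (paragraph [0244] approach):
# angular velocity is held constant across shaft lengths and scales
# inversely with clubhead mass.

def clubhead_speed(ref_speed, ref_shaft, ref_mass, shaft, mass):
    """Speeds in m/s, lengths in m, masses in kg."""
    omega = ref_speed / ref_shaft      # reference angular velocity, rad/s
    omega *= ref_mass / mass           # heavier clubhead -> slower swing
    return omega * shaft

# Reference driver: 44 in (about 1.118 m) shaft, about 7 oz (0.198 kg) head.
```

With the reference equipment the function returns the reference speed; a longer shaft raises clubhead speed and a heavier head lowers it, matching the text.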
  • [0245]
    The clubhead velocity vector can be determined by calculating the direction of the tangent of the swing arc at theta and multiplying by the speed.
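The tangent-and-scale step above can be sketched as follows, in a simplified 2-D swing plane (names and axis conventions are mine, not the patent's):

```python
import math

def clubhead_velocity(speed, theta):
    """Clubhead velocity at impact: the tangent to a circular swing arc
    at angle theta from the low point, scaled by clubhead speed.
    2-D swing plane: x along the aim line, y vertical. At theta = 0 the
    motion is purely horizontal; positive theta gives an ascending blow."""
    return (speed * math.cos(theta), speed * math.sin(theta))
```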
  • [0246]
    The orientation of the clubhead is determined by several factors. Some are controlled directly by the player, while others result from equipment or environmental conditions. Note that orientation in this section refers to the rotation of the entire clubhead, rather than the clubface, which is affected by things like loft, bulge and roll.
  • [0247]
    The most significant input is the swing arc, which incorporates the player's chosen aim line. For a perfect swing on level ground, the clubhead is presented in a level and neutral orientation (neither open nor closed), perpendicular to the aim line. The other player inputs act as modifiers to this basic stance.
  • [0248]
    Ball placement modifies the position in the swing arc where contact is made. If theta is negative, the clubhead will be tilted downward and opened slightly. If theta is positive, the clubhead will be tilted upward and closed slightly.
  • [0249]
    Opening or closing the stance will affect the clubhead's rotation about the z-axis, turning the clubface across the line of motion. Another option is opening or closing the club itself by rotating the handle; this affects both vertical and horizontal rotations of the clubhead.
  • [0250]
    Additional inputs, not controlled by the player, also affect the orientation. One major factor is shaft flex. At the start of the downswing, the flexible shaft bends backwards as the hands accelerate the heavy mass of the clubhead downward. Near the end of the downswing, however, the golfer's wrists release, transferring energy from arms and wrists into the clubhead. This slows the hands, relative to the clubhead, and causes the shaft to flex the opposite direction, which tilts the clubhead upward. This tilt causes the “dynamic loft” of the clubhead to be several degrees greater than clubface loft.
  • [0251]
    Shaft flex is modeled based on the mass and velocity of the clubhead; however, other models of shaft flex are possible.
  • [0252]
    Lastly, the angle of the lie can affect the orientation of the clubhead. A sidehill lie tilts the swing plane, which affects the heel-toe level of the clubhead. Since the swing arc is defined in relation to the surface, uphill and downhill lies affect the tilt of the clubhead in world coordinates.
  • [0253]
    The purpose of an error model is to represent deviations from the perfect swing. The error model combines inputs from the swing meter and attributes from the gaming system to determine the type and amount of error to introduce. The error model generates a set of modifiers that are applied to the outputs from the arm model to determine the actual state of the clubhead just before striking the golf ball.
  • [0254]
    A golf swing is a complicated motion with many opportunities for error. Trying to model individual errors during the swing would be prohibitively complicated, as well as difficult to tune and control. Fortunately, almost all errors can be grouped into a relatively small number of categories based on their effect on the impact between club and ball. Instead of modeling individual errors in the swing, the resulting effects are modeled directly. The major error types are detailed in TABLE 4.

  • TABLE 4
    Speed Error: The golfer may swing the club more quickly or slowly than he intended. This will primarily affect the launch angle and distance of the shot, with a secondary effect on spin. Speed error can be measured as a percentage of desired club speed.
    Directional Error: The golfer may swing the clubhead through the ball in a slightly different direction than he intended. Assuming the face is kept square to the direction of motion, the resulting shot will either be pushed or pulled. Directional error can be measured in degrees, with positive to the left.
    Orientation Error: The golfer may fail to align the clubface squarely with the direction of motion. This produces sidespin, which will result in a hooking or a slicing trajectory. Orientation error is measured in degrees of rotation away from square, relative to the direction of motion. A positive value represents a closed clubface, while a negative value represents an open clubface.
    Positional Error: The golfer may hit the ball somewhere other than directly in the center of the clubface. This causes the clubhead to rotate during the impact, which robs the shot of power. Depending on the clubhead parameters, the ball may also be pushed or pulled, and sidespin may be produced. Positional error can be measured in meters.

  • [0255]
    Interactions between the clubhead and the ground can result in additional types of error. These types of interactions can be handled by the ground model.
  • [0256]
    There are two main sources of error in the swing model. The primary source of error is the swing meter. The secondary source of error is essentially random, intended to represent the inherent difficulty of properly executing a perfect swing. Random errors should be significantly smaller than those introduced by the swing meter, to keep players from feeling the game is too unpredictable or “cheating”. Both sources of error should decrease as the golfer becomes more experienced.
  • [0257]
    The swing meter is the primary interface for controlling a golf swing (see FIG. 7J). The location of the final click determines the types and amounts of error that are applied to the shot. This user interface element gives the player direct control over shots and provides clear, unambiguous feedback on whether a swing was successful.
  • [0258]
    The types of errors described above suggest a basic set of player attributes. These could be further subdivided based on club type, surface type, etc. For the initial skill challenge, however, the gaming attributes will be directly linked to the error types.
  • [0259]
    The amount of error for each type is calculated based on the swing meter and a random input that simulates the normal probability distribution function. The regions of the swing meter between the points indicated by D are each represented as a number in [−1.0, 1.0]. The number corresponding to the region between the points indicated by B in FIG. 7J is named S1, and the number corresponding to the region between the points indicated by C and D is named S2. To preserve continuity between regions, S1 has a magnitude of 1 when S2 is non-zero. The random normal input ranges over [−1.0, 1.0] and is named R. Each gaming attribute consists of three coefficients, which are applied to S1, S2, and R to determine the final error amount using the formula error = k1*S1 + k2*S2 + k3*R.
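The error formula above can be sketched directly; the function name is mine, and the caller is assumed to supply R drawn from a clipped normal distribution:

```python
import math

def swing_error(s1, s2, k1, k2, k3, r):
    """error = k1*S1 + k2*S2 + k3*R, the formula from paragraph [0259].
    s1, s2: swing-meter region values in [-1.0, 1.0].
    r: random input in [-1.0, 1.0], drawn by the caller from a clipped
    normal distribution.
    S1 saturates to magnitude 1 whenever S2 is non-zero, preserving
    continuity between meter regions."""
    if s2 != 0.0:
        s1 = math.copysign(1.0, s1)
    return k1 * s1 + k2 * s2 + k3 * r
```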
  • [0260]
    This formula allows any error type to be linked to the swing meter and provides a simple linear range across each region of the swing meter. The linear relationship may need to be replaced by a curve, but the shape of the curve has not yet been specified. This should suffice for the skill challenge, but may need to be revised for the full game. The formula may also need to be expanded to include other terms (power, for example).
  • [0261]
    In various implementations, between A and B small amounts of directional error are added. This causes the shot to have a slight push/pull. Between C and D, the magnitude of directional error increases, and moderate amounts of orientation error are added as well to provide hook and slice. (The region between D and E can be handled by a special case.) This corresponds to the following coefficients in TABLE 5.

  • TABLE 5
    ERROR TYPE     K1       K2         K3
    Direction      Small    Small      0
    Orientation    0        Moderate   0

  • [0262]
    The purpose of a ground model is to represent interactions between the clubhead and the ground. The outputs of the ground model are a set of modifiers to the clubhead velocity and orientation, as well as clubface friction, based on the degree of contact between the clubhead and ground surface. The degree of contact is estimated using the trajectory of the clubhead and certain clubhead parameters. The relationship between inputs and outputs is defined for each different surface type.
  • [0263]
    The set of input variables to the model should allow similar choices as when playing a difficult lie in the real-world. For example, when hitting from deep or “nesty” lies, golfers are advised to “hit down” on the ball. Using a descending swing has two beneficial effects. First, the steep trajectory minimizes the amount of contact with the ground before the ball, which maintains clubhead velocity. Second, the steep trajectory minimizes the amount of grass or other material that can be pinched between the ball and clubface, which maintains clubface friction.
  • [0264]
    Each modifier can have its own formula with a different set of inputs. One common input is the amount of contact between the clubhead and the ground. This can be estimated using the depth of the ball's lie, the ball placement in the stance, and the swing arc, normalizing the result to a range between zero and one, suitable for scaling other values. The output modifiers are described below in TABLE 6.

  • TABLE 6
    Decrease in clubhead velocity: Friction between the clubhead and the ground causes the clubhead to slow down. Surfaces like sand and water cause high amounts of drag, while other surfaces like fringe and rough cause relatively less drag. The amount of drag is generally proportional to the amount of contact between the clubhead and ground. For fluid and semi-fluid surfaces like water and sand, the shape of the clubhead also plays a role; low-lofted clubheads displace more material than thinner, higher-lofted clubheads, so low-lofted clubheads experience more drag. As with any fluid resistance, drag also increases with velocity, so a fast-moving clubhead will lose more velocity than a slower one. Input variables: amount of contact, clubhead loft, clubhead velocity.
    Change of clubhead orientation: If the clubhead is not level when it contacts the ground (on a sidehill lie, for example), it may experience uneven amounts of drag. If the toe or heel contacts the surface more firmly than the opposite end, it will experience increased drag and the clubhead will rotate around the vertical axis. This can cause hooks and slices. Input variables: amount of contact, clubhead orientation, clubhead velocity.
    Decrease in clubface friction: Friction between the clubface and ball can be reduced when hitting from deep, juicy roughs. As the clubhead passes through the grass, moist debris may accumulate on the face of the club. This debris lubricates the clubface, decreasing friction. The decreased friction causes the ball to leave the clubface with less backspin, which affects green-holding; it also may slightly increase the height of the trajectory. Grooves on the clubface help to trap this debris, which keeps the clubface clear and maintains friction. Input variables: amount of contact, clubhead grooves.
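The "amount of contact" input used by these modifiers is not given a formula in the text. One possible normalization, with made-up constants (the depth scale and the 0.7 steepness weight are illustrative, not from the patent):

```python
def contact_amount(lie_depth_m, attack_angle_deg, max_depth_m=0.05):
    """One possible normalization of clubhead-ground contact to [0, 1].
    A deeper lie increases contact; a steeper descending blow (larger
    attack angle, in degrees down) decreases it, matching the 'hit down'
    advice discussed in paragraph [0263]. All constants are hypothetical."""
    depth_term = min(max(lie_depth_m, 0.0) / max_depth_m, 1.0)
    steepness = min(max(attack_angle_deg, 0.0), 10.0) / 10.0
    return depth_term * (1.0 - 0.7 * steepness)
```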

  • [0265]
    The ground model currently does not include a modifier to represent clubhead bounce on hardpan lies. This can be added if desired, but it introduces a level of subtle variability that may not be understood by players.
  • [0266]
    The purpose of a collision response model is to calculate the linear and angular velocity for the ball after being hit with the club. The model combines the outputs from the arm model, error model, and ground model to determine the position, orientation and velocity of the clubhead just before impact. The impact between ball and club is modeled as a rigid body collision. Both club and ball are treated as free bodies, which allows us to apply the conservation of momentum and Coulomb's friction laws to determine a reasonable approximation of the physical state after the collision.
  • [0267]
    The impact between a golf club and a golf ball is a remarkably violent event. When driving from the tee, for example, the clubhead, which is traveling somewhere between 70 and 120 miles per hour, strikes a stationary ball. The ball compresses against the clubface and then springs back, launching the ball at speeds in excess of 150 miles per hour. The entire collision lasts only half a millisecond, during which the force between the clubface and ball averages 1400 pounds.
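The quoted figures are mutually consistent, which a little impulse-momentum arithmetic shows (the ball mass here is the USGA maximum, an assumption on my part):

```python
# Rough consistency check of the figures quoted above (illustrative only).
ball_mass = 0.04593            # kg, roughly the USGA maximum (1.62 oz)
launch_speed = 150 * 0.44704   # m/s ("in excess of 150 miles per hour")
contact_time = 0.5e-3          # s ("half a millisecond")

# Average force = change in momentum / contact time
avg_force_newtons = ball_mass * launch_speed / contact_time
avg_force_pounds = avg_force_newtons / 4.448   # close to the quoted 1400 lb
```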
  • [0268]
    Because the clubface is tilted, the ball also starts sliding up the clubface during the collision. This sliding generates a frictional force, applied tangentially at the contact point in the opposite direction of the sliding. The friction causes the ball to rotate. If the combination of normal force and coefficient of friction is high enough, the ball will begin to roll before it leaves the clubface. This rotation causes backspin.
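If the ball does reach a rolling state before separation, a crude upper bound on the backspin follows from matching the ball's surface speed to the tangential launch component. This is a sketch, not the patent's model, and it overestimates real spin rates:

```python
import math

def rolling_backspin_rpm(launch_speed_ms, loft_deg, ball_radius_m=0.02134):
    """Crude upper bound on backspin under the rolling assumption: the
    ball's surface speed matches the launch-speed component along the
    tilted clubface. Names and the simplification are mine."""
    v_tangential = launch_speed_ms * math.sin(math.radians(loft_deg))
    omega = v_tangential / ball_radius_m        # rad/s
    return omega * 60.0 / (2.0 * math.pi)       # rpm
```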
  • [0269]
    If the clubface is not aligned with the direction of motion, the tangential velocity will have a horizontal component as well. This horizontal component will cause the ball to rotate around a vertical axis, causing sidespin and a resulting hook or slice.
  • [0270]
    If the ball strikes the clubface off-center, the normal force between the clubface and ball will cause the clubhead itself to start to rotate. This rotation has several effects. First, it robs the shot of some power; energy is transferred into the angular momentum of the clubhead instead of the linear momentum of the ball. Second, the rotation turns the clubface in a new direction, which has a slight effect on the subsequent motion of the ball. Lastly, the rotation of the clubhead creates a tangential velocity between the ball and clubface. This tangential velocity causes a frictional force to arise, which causes the ball to spin in the opposite direction to the club. This is the so-called “gear effect”.
  • [0271]
    In addition to the vectors describing the velocity and position of the clubhead, the collision response model also uses the physical properties of the club and ball described in TABLES 7 and 8.

  • TABLE 7
    Mass: The clubhead mass is used to determine the total momentum of the system before the collision. Increasing the clubhead mass generates higher launch speeds. It also stabilizes the clubhead against off-center hits. (See the arm model for further discussion of clubhead mass.)
    Moment of Inertia (MOI): The moment of inertia describes the mass distribution of the clubhead. This affects how easily the clubhead rotates in response to an off-center hit. Modern clubhead designs focus on pushing mass to the perimeter of the club to maximize the MOI. While technically this should be described using an inertia tensor, the MOI is distilled into a single number, ranging from 0.0 to 1.0, describing the overall resistance of the clubhead to twisting, with 0.5 being neutral.
    Coefficient of Restitution (CoR): According to Newton's model of collision, the coefficient of restitution is the ratio of the final relative velocity to the initial relative velocity. For the clubface, the CoR determines the amount of “spring-like effect” produced by the clubface. Clubs with spring-like effect have faces that deform on impact with the ball. Because deforming a thin, flexible clubface is more efficient than deforming a golf ball, clubs with a high CoR generate higher launch speeds. The coefficient of restitution ranges from 0.0 to 1.0, though the USGA rules specify a maximum CoR of 0.830. The clubface CoR is combined with the ball CoR to determine the effective CoR for the collision.
    Loft: This defines the angle between the clubface and vertical when the club is properly soled on a level surface. It is the primary factor in determining the launch angle. Low-lofted clubs, like drivers, produce low launch angles with relatively little spin. High-lofted clubs, like 9-irons and wedges, produce high launch angles with lots of backspin. The loft of the club is combined with the clubhead orientation to determine the surface normal for the collision.
    Bulge: This describes the radius of curvature for a horizontal slice of the clubface. A club with lots of bulge has a relatively small radius of curvature, while a club with little bulge has a relatively large radius of curvature. A club with no bulge has a flat clubface. Bulge causes a clubface to respond differently to off-center hits. Because the normal points away from the center of the club, off-center hits are aimed to the side. On a well-designed club, bulge can be used to counter the spin caused by gear effect. Bulge could be considered as adding a sort of horizontal loft. The center of the clubface is assumed to be neutral (i.e., it does not angle to either side). This is not true for clubs designed to correct persistent shot problems, like draw clubs.
    Roll: This describes the radius of curvature for a vertical slice of the clubface. (It is like a vertical version of bulge.) Roll affects the effective loft for off-center collisions. On a rolled clubface, the loft varies depending on the vertical distance from the center of the clubface. At the center of the clubface, the loft is the nominal loft of the club. Above the center, the loft increases, while below the center, the loft decreases. Roll serves little purpose in club design, though it is common.
    Friction: The coefficient of friction relates the amount of horizontal force opposing sliding to the normal force between the ball and the club. The clubface has a basic value for friction, which is combined with the ball's friction and the ground model's friction modifier to determine the overall friction coefficient for the collision.

  • TABLE 8
    Mass: The ball's mass is a major factor in determining launch speed.
    Moment of Inertia (MOI): The moment of inertia describes the relationship between torque and rotation for the ball. In other words, it describes how easily the ball rolls. Different types of ball construction can have different moments of inertia. This is expressed as a constant multiplier to the MOI for a solid, uniform sphere with the ball's mass.
    Coefficient of Restitution (CoR): As noted above, the coefficient of restitution describes the ratio between the final relative velocity and the initial relative velocity in a collision. The ball's CoR is combined with the club's CoR to determine the dynamic CoR for the collision.
    Friction: The coefficient of friction describes the relationship between the resistive force and the normal force as the ball slides along a surface. This also depends on the construction of the ball. Balls designed for shot shaping have softer, less durable covers with higher coefficients of friction to generate higher levels of spin. The coefficient of friction really only comes into play, however, on slower swings with higher-lofted clubs, as the normal force during a drive is so great that practically any coefficient of friction is sufficient to start rolling.

  • [0272]
    In various implementations, the collision response model uses closed-form, algebraic equations to determine the collision impulse and resulting motion. Conservation of momentum and the Newtonian model of collision restitution are used to determine the collision impulse and final normal velocity. Coulomb's friction model is used to calculate the effect of tangential velocity on the ball during the collision.
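For the normal direction, conservation of momentum plus Newtonian restitution for two free bodies gives a standard closed-form result. A minimal sketch (function and parameter names are mine; this is the textbook two-body formula, not the patent's full model):

```python
def normal_launch_speed(club_speed_normal, club_mass, ball_mass, cor):
    """Ball speed along the clubface normal after impact, from conservation
    of momentum and Newtonian restitution for two free bodies:
        v_ball = (1 + e) * M * v_club / (M + m)
    where M is clubhead mass, m is ball mass, and e is the effective CoR."""
    return (1.0 + cor) * club_mass * club_speed_normal / (club_mass + ball_mass)
```

With a 0.2 kg clubhead, a 0.046 kg ball, and a CoR near the USGA limit, a 45 m/s normal clubhead speed launches the ball at roughly 67 m/s, consistent with the driving numbers quoted earlier.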
  • [0273]
    The algorithm used by the collision response model follows that described by Penner, with several differences. First, the roll (vertical curvature) of the clubface is accounted for. Second, Penner's assumption that the ball is rolling at the end of the collision is reasonable for loft angles below forty degrees and simplifies the analysis somewhat, but the game accurately models clubs with higher degrees of loft, so this assumption is replaced with a calculation that determines whether the ball is rolling or sliding at the end of the collision. Lastly, a more simplistic model of mass distribution is used.
  • [0274]
    The clubhead's impact with the ball can be modeled using existing techniques (see, e.g., Penner, A. R. “The physics of golf: The optimum loft of a driver,” American Journal of Physics 69 (2001): 563-568 and Penner, A. R. “The physics of golf: The convex face of a driver,” American Journal of Physics 69 (2001): 1073-1081). In various implementations, the assumption that the clubhead velocity has no sideways component is modified.
  • [0275]
    Implementations of the invention and all of the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the invention can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
  • [0276]
    The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.
  • [0277]
    A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • [0278]
    The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • [0279]
    Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • [0280]
    To provide for interaction with a user, implementations of the invention can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • [0281]
    Implementations of the invention can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the invention, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • [0282]
    The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • [0283]
    While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular implementations of the invention. Certain features that are described in this specification in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • [0284]
    Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown, or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
  • [0285]
    Thus, particular implementations of the invention have been described. Other implementations are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results.
  • [0286]
    All articles, publications and patents referred to herein are incorporated by reference for all purposes.




Last edited by Paul on Wed 30 Mar 2022, 11:38 pm; edited 1 time in total


Paul



Please enjoy

_________________

May the SUN always be with you

home of

https://www.valleyofthesuncc.com/ an information and entertainment only website
Paul
Admin

Posts : 42010
Join date : 2013-05-06

https://www.valleyofthesuncc.com

Back to top Go down

VEM VARIOUS PATENTS AND PURPOSES Empty Re: VEM VARIOUS PATENTS AND PURPOSES

Post by Paul Sun 27 Mar 2022, 11:09 pm



Paul



Please enjoy

_________________

May the SUN always be with you

home of

https://www.valleyofthesuncc.com/ an information and entertainment only website
Paul
Admin

Posts : 42010
Join date : 2013-05-06

https://www.valleyofthesuncc.com

Back to top Go down

The author of this message was banned from the forum - See the message

VEM VARIOUS PATENTS AND PURPOSES Empty Re: VEM VARIOUS PATENTS AND PURPOSES

Post by Paul Thu 12 May 2022, 8:32 am

While I'm catching my breath and wiping the sweat off, I will take a moment for a break here.
The information above is meant for a very small audience, and was only copied and pasted from one particular Google patent.
The same is true of any information that I share on my website. It matters not that it doesn't interest some people, but rather that it might interest someone.
The same can be said for anyone who chooses to post any information themselves. What is important to them may not hold any interest at all for anyone else.
Thank you for your response. It is much appreciated, and you may continue to post anything you would like.

Thank you very much
Paul
Admin

Posts : 42010
Join date : 2013-05-06

https://www.valleyofthesuncc.com

Back to top Go down

VEM VARIOUS PATENTS AND PURPOSES Empty Re: VEM VARIOUS PATENTS AND PURPOSES

Post by Paul Thu 12 May 2022, 8:37 am

As for the story of My Life. There are bits and pieces of it posted throughout this website. And I will continue to post anything that I please.

Knowing, of course, that it will not interest many. It has been quite a few years since I have shared any of myself on this website.

This is my personal website, used as storage for information, art, and entertainment that I feel is save-worthy. I am not trying to gain any interest from users; I couldn't care less about any of that.

Again. Thank you for your input.
Paul
Admin

Posts : 42010
Join date : 2013-05-06

https://www.valleyofthesuncc.com

Back to top Go down

Back to top


 
Permissions in this forum:
You cannot reply to topics in this forum