MPSTME, NMIMS

2012-2013
Multimedia
Ameya Dighe 162, Raghav Jaju 165, Ujjwal Kumar 177

Times have changed. People want to use the Internet not only for text and image communications, but also for audio and video services. We concentrate on applications that use the Internet for audio and video services.
1. INTRODUCTION
Audio and video services can be divided into three categories:
* streaming stored audio/video
* streaming live audio/video
* interactive audio/video

In the first category, streaming stored audio/video, the files are compressed and stored on a server. A client downloads the files through the Internet. This is sometimes referred to as on-demand audio/video. Examples of stored audio files are songs, symphonies, books on tape, and famous lectures. Examples of stored video files are movies, TV shows, and music video clips.

In the second category, streaming live audio/video, a user listens to broadcast audio and video through the Internet. A good example of this type of application is the Internet radio. Some radio stations broadcast their programs only on the Internet; many broadcast them both on the Internet and on the air. Internet TV is not popular yet, but many people believe that TV stations will broadcast their programs on the Internet in the future.

In the third category, interactive audio/video, people use the Internet to communicate interactively with one another. Good examples of this application are Internet telephony and Internet teleconferencing, such as Skype or Facebook video chat.
2. DIGITIZING AUDIO AND VIDEO
DIGITIZING AUDIO
When sound is fed into a microphone, an electronic analog signal is generated that represents the sound amplitude as a function of time. The signal is called an analog audio signal. An analog signal, such as audio, can be digitized to produce a digital signal. According to the Nyquist theorem, if the highest frequency of the signal is f, we need to sample the signal 2f times per second. There are other methods for digitizing an audio signal, but the principle is the same.

Sampling            | Voice   | Music
Samples per second  | 8000    | 44,100
Bits per sample     | 8       | 16
Result              | 64 kbps | 705.6 kbps (mono), 1.411 Mbps (stereo)

Digitizing Video
A video consists of a sequence of frames. If the frames are displayed on the screen fast enough, we get an impression of motion. The reason is that our eyes cannot distinguish the rapidly flashing frames as individual ones. There is no standard number of frames per second; in North America 25 frames per second is common. However, to avoid a condition known as flickering, a frame needs to be refreshed. The TV industry repaints each frame twice. This means 50 frames need to be sent, or if there is memory at the sender site, 25 frames with each frame repainted from the memory.
Each frame is divided into small grids, called picture elements or pixels. For black-and-white TV, each 8-bit pixel represents one of 256 different gray levels. For a color TV, each pixel is 24 bits, with 8 bits for each primary color (red, green, and blue). We can calculate the number of bits per second for a specific resolution. At the lowest resolution a color frame is made of 1024 × 768 pixels. This means that we need

2 × 25 × 1024 × 768 × 24 = 944 Mbps

This data rate requires a very high-speed technology such as SONET. To send video using lower-rate technologies, we need to compress the video.
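As a quick check, the audio and video data rates quoted above can be reproduced with simple arithmetic (a sketch using the numbers from the text):

```python
# Audio rates from the sampling table (Nyquist: sample at 2f per second).
voice_bps = 8000 * 8                   # 64,000 bps = 64 kbps
music_mono_bps = 44100 * 16            # 705,600 bps = 705.6 kbps
music_stereo_bps = music_mono_bps * 2  # 1,411,200 bps ≈ 1.411 Mbps

# Uncompressed 1024 x 768 color video at 25 frames/s,
# each frame repainted twice to avoid flickering:
video_bps = 2 * 25 * 1024 * 768 * 24   # 943,718,400 bps ≈ 944 Mbps

print(voice_bps, music_stereo_bps, video_bps)
```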

Compression is needed to send video over the Internet.
3. STREAMING STORED AUDIO/VIDEO
Now that we have discussed digitizing and compressing audio/video, we turn our attention to specific applications. The first is streaming stored audio and video. Downloading these types of files from a Web server can be different from downloading other types of files. To understand the concept, let us discuss three approaches, each with a different complexity.

First Approach: Using a Web Server

A compressed audio/video file can be downloaded like any other file. The client (browser) can use the services of HTTP and send a GET message to download the file. The Web server sends the compressed file to the browser, which then uses a helper application, normally called a media player, to play the file. Figure below shows this approach. This approach is very simple and does not involve streaming. However, it has a drawback. An audio/video file is usually large even after compression. An audio file may contain tens of megabits, and a video file may contain hundreds of megabits. The file needs to be downloaded completely before it can be played, so at contemporary data rates the user must wait seconds or tens of seconds before playback can start.

Second Approach: Using a Web Server with Metafile
In another approach, the media player is directly connected to the Web server for downloading the audio/video file. The Web server stores two files: the actual audio/video file and a metafile that holds information about the audio/video file. Figure below shows the steps in this approach.

1. The HTTP client accesses the Web server using the GET message.
2. The information about the metafile comes in the response.
3. The metafile is passed to the media player.
4. The media player uses the URL in the metafile to access the audio/video file.
5. The Web server responds.

Third Approach: Using a Media Server
The problem with the second approach is that the browser and the media player both use the services of HTTP. HTTP is designed to run over TCP. This is appropriate for retrieving the metafile, but not for retrieving the audio/video file, because TCP retransmits a lost or damaged segment, which is counter to the philosophy of streaming. We need to dismiss TCP and its error control and use UDP instead. However, HTTP, which accesses the Web server, and the Web server itself are designed for TCP, so we need another server: a media server. Figure below shows the concept.

1. The HTTP client accesses the Web server using a GET message.
2. The information about the metafile comes in the response.
3. The metafile is passed to the media player.
4. The media player uses the URL in the metafile to access the media server to download the file. Downloading can take place by any protocol that uses UDP.
5. The media server responds.

Fourth Approach: Using a Media Server and RTSP
The Real-Time Streaming Protocol (RTSP) is a control protocol designed to add more functionalities to the streaming process. Using RTSP, we can control the playing of audio/video. RTSP is an out-of-band control protocol that is similar to the second connection in FTP. Figure below shows a media server and RTSP.

1. The HTTP client accesses the Web server using a GET message.
2. The information about the metafile comes in the response.
3. The metafile is passed to the media player.
4. The media player sends a SETUP message to create a connection with the media server.
5. The media server responds.
6. The media player sends a PLAY message to start playing (downloading).
7. The audio/video file is downloaded using another protocol that runs over UDP.
8. The connection is broken using the TEARDOWN message.
9. The media server responds.
The media player can send other types of messages. For example, a PAUSE message temporarily stops the downloading; downloading can be resumed with a PLAY message.
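The RTSP exchange above is text-based, much like HTTP. The following sketch shows what the client's messages might look like (the URL, ports, and session ID are hypothetical, and a real client would take the session ID from the server's reply to SETUP rather than hard-coding it):

```python
# Build RTSP request messages for the SETUP / PLAY / TEARDOWN sequence.
def rtsp_request(method, url, cseq, extra_headers=None):
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    if extra_headers:
        lines.extend(extra_headers)
    # RTSP, like HTTP, ends the header block with a blank line (CRLF CRLF).
    return "\r\n".join(lines) + "\r\n\r\n"

url = "rtsp://media.example.com/song.mp3"   # hypothetical media URL
print(rtsp_request("SETUP", url, 1,
                   ["Transport: RTP/AVP;unicast;client_port=5004-5005"]))
print(rtsp_request("PLAY", url, 2, ["Session: 12345678"]))
print(rtsp_request("TEARDOWN", url, 3, ["Session: 12345678"]))
```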


4. STREAMING LIVE AUDIO/VIDEO

Streaming live audio/video is similar to the broadcasting of audio and video by radio and TV stations. Instead of broadcasting to the air, the stations broadcast through the Internet. There are several similarities between streaming stored audio/video and streaming live audio/video. They are both sensitive to delay; neither can accept retransmission. However, there is a difference. In the first application, the communication is unicast and on-demand. In the second, the communication is multicast and live. Live streaming is better suited to the multicast services of IP and the use of protocols such as UDP and RTP (discussed later). However, presently, live streaming is still using TCP and multiple unicasting instead of multicasting. There is still much progress to be made in this area.

5. REAL-TIME INTERACTIVE AUDIO/VIDEO
In real-time interactive audio/video, people communicate with one another in real time. The Internet phone or voice over IP is an example of this type of application. Video conferencing is another example that allows people to communicate visually and orally.

Characteristics
Before discussing the protocols used in this class of applications, we discuss some characteristics of real-time audio/video communication.

Time Relationship
Real-time data on a packet-switched network require the preservation of the time relationship between packets of a session. For example, let us assume that a real-time video server creates live video images and sends them online. The video is digitized and packetized. There are only three packets, and each packet holds 10 s of video information.
The first packet starts at 00:00:00, the second packet starts at 00:00:10, and the third packet starts at 00:00:20. Also imagine that it takes 1 s (an exaggeration for simplicity) for each packet to reach the destination (equal delay). The receiver can play back the first packet at 00:00:01, the second packet at 00:00:11, and the third packet at 00:00:21.
Although there is a 1-s time difference between what the server sends and what the client sees on the computer screen, the action is happening in real time. The time relationship between the packets is preserved. The 1-s delay is not important. Figure below shows the idea.

But what happens if the packets arrive with different delays? For example, the first packet arrives at 00:00:01 (1-s delay), the second arrives at 00:00:15 (5-s delay), and the third arrives at 00:00:27 (7-s delay). If the receiver starts playing the first packet at 00:00:01, it will finish at 00:00:11. However, the next packet has not yet arrived; it arrives 4 s later. There is a gap between the first and second packets and between the second and the third as the video is viewed at the remote site. This phenomenon is called jitter. Figure below shows the situation.

Timestamp

One solution to jitter is the use of a timestamp. If each packet has a timestamp that shows the time it was produced relative to the first (or previous) packet, then the receiver can add this time to the time at which it starts the playback. In other words, the receiver knows when each packet is to be played. Imagine the first packet in the previous example has a timestamp of 0, the second has a timestamp of 10, and the third a timestamp of 20. If the receiver starts playing back the first packet at 00:00:08, the second will be played at 00:00:18, and the third at 00:00:28. There are no gaps between the packets. Figure below shows the situation.
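The example can be replayed numerically (a sketch; the times are the seconds used in the text):

```python
# Packets arrive with different delays, but timestamps fix each packet's
# playback instant relative to the start of playback (here t = 8 s).
arrivals   = [1, 15, 27]    # arrival times, seconds after 00:00:00
timestamps = [0, 10, 20]    # timestamps relative to the first packet
playback_start = 8

playback = [playback_start + ts for ts in timestamps]
print(playback)             # [8, 18, 28] -> no gaps between packets

# Playback is smooth only if every packet arrives before its slot:
assert all(a <= p for a, p in zip(arrivals, playback))
```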

Playback Buffer

To be able to separate the arrival time from the playback time, we need a buffer to store the data until they are played back. The buffer is referred to as a playback buffer. When a session begins (the first bit of the first packet arrives), the receiver delays playing the data until a threshold is reached. In the previous example, the first bit of the first packet arrives at 00:00:01; the threshold is 7 s, and the playback time is 00:00:08. The threshold is measured in time units of data. The replay does not start until the time units of data are equal to the threshold value.

Data are stored in the buffer at a possibly variable rate, but they are extracted and played back at a fixed rate. Note that the amount of data in the buffer shrinks or expands, but as long as the delay is less than the time to play back the threshold amount of data, there is no jitter. Figure below shows the buffer at different times for our example.

Ordering
In addition to time relationship information and timestamps for real-time traffic, one more feature is needed. We need a sequence number for each packet. The timestamp alone cannot inform the receiver if a packet is lost. For example, suppose the timestamps are 0, 10, and 20. If the second packet is lost, the receiver receives just two packets with timestamps 0 and 20. The receiver assumes that the packet with timestamp 20 is the second packet, produced 20 s after the first. The receiver has no way of knowing that the second packet has actually been lost. A sequence number to order the packets is needed to handle this situation.
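A tiny sketch of the same scenario, using hypothetical (sequence number, timestamp) pairs, shows how sequence numbers expose the loss that timestamps alone cannot:

```python
# Packet 2 is lost; only packets 1 and 3 arrive.
received = [(1, 0), (3, 20)]   # (sequence number, timestamp)

seqs = [seq for seq, _ in received]
lost = sorted(set(range(seqs[0], seqs[-1] + 1)) - set(seqs))
print(lost)                    # [2] -> the gap in sequence numbers reveals it
```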

Multicasting
Multimedia play a primary role in audio and video conferencing. The traffic can be heavy, and the data are distributed using multicasting methods. Conferencing requires two-way communication between receivers and senders.

Translation
Sometimes real-time traffic needs translation. A translator is a computer that can change the format of a high-bandwidth video signal to a lower-quality narrow bandwidth signal. This is needed, for example, for a source creating a high-quality video signal at 5 Mbps and sending to a recipient having a bandwidth of less than 1 Mbps. To receive the signal, a translator is needed to decode the signal and encode it again at a lower quality that needs less bandwidth.

Mixing
If there is more than one source that can send data at the same time (as in a video or audio conference), the traffic is made of multiple streams. To converge the traffic to one stream, data from different sources can be mixed. A mixer mathematically adds signals coming from different sources to create one single signal.
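A mixer's addition can be sketched in a few lines (a simplification; the clamping range here assumes 16-bit PCM samples, and real mixers handle overflow and scaling more carefully):

```python
# Mix several streams by adding their sample values,
# clamped to the 16-bit sample range to avoid overflow.
def mix(*streams, lo=-32768, hi=32767):
    return [max(lo, min(hi, sum(samples))) for samples in zip(*streams)]

a = [100, -200, 300]
b = [50, 25, -300]
print(mix(a, b))    # [150, -175, 0]
```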

Support from Transport Layer Protocol
The procedures mentioned in the previous sections can be implemented in the application layer. However, they are so common in real-time applications that implementation in the transport-layer protocol is preferable. Let us see which of the existing transport-layer protocols is suitable for this type of traffic. TCP is not suitable for interactive traffic. It has no provision for timestamping, and it does not support multicasting. However, it does provide ordering (sequence numbers). One feature of TCP that makes it particularly unsuitable for interactive traffic is its error-control mechanism. In interactive traffic, we cannot allow the retransmission of a lost or corrupted packet; such a packet must simply be ignored. Retransmission upsets the whole idea of timestamping and playback. Today there is so much redundancy in audio and video signals (even with compression) that we can simply ignore a lost packet. The listener or viewer at the remote site may not even notice it.

UDP is more suitable for interactive multimedia traffic. UDP supports multicasting and has no retransmission strategy. However, UDP has no provision for timestamping, sequencing, or mixing. A new transport protocol, Real Time Transport Protocol (RTP), provides these missing features.
6. RTP (Real-Time Transport Protocol)

RTP is designed to handle real-time traffic on the Internet. It defines a standardized packet format for delivering audio, video, and other multimedia over IP networks.
It is mainly used for entertainment and communication applications in which streaming is required.
RTP is used in areas where the delivery of every single packet is not critical, but the timing (delay) is. For example, when streaming a video on YouTube there is an acceptable limit within which the video should buffer; the higher the bandwidth of the connection, the faster the video buffers.

RTP is used in conjunction with RTCP (Real-time Transport Control Protocol), which is used to monitor transmission statistics and Quality of Service (QoS).
The figure below shows the position of RTP, which lies between UDP and the application layer.

RTP provides facilities such as timestamping, sequencing, and mixing.
Which Protocol to Use?
The question is how to transfer data with the minimum possible delay. UDP is used for this purpose because transmission over UDP is fast: it does not wait for acknowledgments (i.e., it does not check whether a packet was received by the receiver).
Hence UDP, with some additions built on top of it, is used.

UDP Port
Although RTP is itself a transport protocol, it is treated like an application program: rather than being encapsulated directly in an IP datagram, it is encapsulated in a UDP user datagram.
The port number chosen for RTP must be an even number; the next odd number is used by its companion protocol, RTCP.

7. RTP Packet Format

Ver: This 2-bit field defines the version number. The current version is 2.

P: This 1-bit field, if set to 1, indicates the presence of padding at the end of the packet. In this case, the value of the last byte in the padding defines the length of the padding. Padding is the norm if a packet is encrypted. There is no padding if the value of the P field is 0.

X: This 1-bit field, if set to 1, indicates an extra extension header between the basic header and the data. There is no extra extension header if the value of this field is 0.

Contributor count: This 4-bit field indicates the number of contributors. Note that we can have a maximum of 15 contributors because a 4-bit field allows only a number between 0 and 15.

M: This 1-bit field is a marker used by the application to indicate, for example, the end of its data.

Payload type: This 7-bit field indicates the type of the payload.

Sequence number: This field is 16 bits in length. It is used to number the RTP packets. The sequence number of the first packet is chosen randomly; it is incremented by 1 for each subsequent packet. The sequence number is used by the receiver to detect lost or out of order packets.

Timestamp: This is a 32-bit field that indicates the time relationship between packets. The timestamp for the first packet is a random number. For each succeeding packet, the value is the sum of the preceding timestamp plus the time the first byte is produced (sampled). The value of the clock tick depends on the application. For example, audio applications normally generate chunks of 160 bytes; the clock tick for this application is 160. The timestamp for this application increases 160 for each RTP packet.

Synchronization source identifier: If there is only one source, this 32-bit field defines the source. However, if there are several sources, the mixer is the synchronization source and the other sources are contributors. The value of the source identifier is a random number chosen by the source. The protocol provides a strategy in case of conflict (two sources choosing the same identifier).

Contributor identifier: Each of these 32-bit identifiers (a maximum of 15) defines a source. When there is more than one source in a session, the mixer is the synchronization source and the remaining sources are the contributors
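The fixed 12-byte header described above can be packed with Python's struct module (an illustrative sketch with no CSRC list; the field values are hypothetical):

```python
import struct

def rtp_header(seq, timestamp, ssrc, payload_type, marker=0,
               version=2, padding=0, extension=0, cc=0):
    # Byte 0: version (2 bits) | padding | extension | contributor count (4 bits)
    byte0 = (version << 6) | (padding << 5) | (extension << 4) | cc
    # Byte 1: marker (1 bit) | payload type (7 bits)
    byte1 = (marker << 7) | payload_type
    # Then: 16-bit sequence number, 32-bit timestamp, 32-bit SSRC,
    # all in network byte order ("!").
    return struct.pack("!BBHII", byte0, byte1, seq, timestamp, ssrc)

# Hypothetical values: one 160-byte audio chunk, so the timestamp tick is 160.
hdr = rtp_header(seq=1, timestamp=160, ssrc=0x12345678, payload_type=0)
print(len(hdr), hdr.hex())
```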

8. Real-time Transport Control Protocol

RTCP is used to control the flow and quality of service (QoS) and to allow the recipient to send feedback. It partners with RTP in the delivery and packaging of multimedia data, but does not transport any media stream itself. It uses five types of messages.

Sender Report: The sender report is sent periodically by the active senders in a conference to report transmission and reception statistics for all RTP packets sent during the interval. The sender report includes an absolute timestamp, which allows the receiver to synchronize messages.

Receiver Report: The receiver report is for passive participants, those that do not send RTP packets. The report informs the sender and other receivers about the quality of service.
Source Description Message: The source periodically sends a source description message to give additional information about itself. This information can be the name, e-mail address, telephone number, and address of the owner or controller of the source.
Bye Message: A source sends a bye message to shut down a stream. It allows the source to announce that it is leaving the conference. Although other sources can detect the absence of a source, this message is a direct announcement. It is also very useful to a mixer.
Application-Specific Message: The application-specific message is a packet for an application that wants to use message types not defined in the standard. It allows the definition of a new message type.

UDP Port
The UDP port chosen for RTCP must be the next (odd) number after the port chosen for RTP.
9. VOICE OVER IP
Voice over IP allows two parties to communicate over a packet-switched network instead of a circuit-switched network.
Two protocols have been designed to handle this kind of communication: SIP and H.323.
SIP (Session Initiation Protocol)
It is an application-layer protocol that establishes, manages, and terminates a multimedia session (call). It can be used to create two-party, multiparty, or multicast sessions. SIP is designed to be independent of the underlying transport layer; it can run on UDP, TCP, or SCTP.

The caller initializes a session with the INVITE message. After the callee answers the call, the caller sends an ACK message for confirmation. The BYE message terminates a session. The OPTIONS message queries a machine about its capabilities. The CANCEL message cancels an already started initialization process. The REGISTER message makes a connection when the callee is not available.
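A minimal SIP INVITE can be sketched as text (all addresses and identifiers are hypothetical; a real message also carries Via and Contact headers and usually an SDP body describing the media):

```python
# Build a minimal, illustrative SIP INVITE message.
invite = "\r\n".join([
    "INVITE sip:bob@example.com SIP/2.0",   # request line: method, callee URI
    "From: sip:alice@example.com",          # caller
    "To: sip:bob@example.com",              # callee
    "Call-ID: a84b4c76e66710",              # hypothetical call identifier
    "CSeq: 1 INVITE",                       # command sequence number
    "Content-Length: 0",                    # no SDP body in this sketch
]) + "\r\n\r\n"

print(invite)
```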

Simple Session

A simple session includes establishing the session, communicating, and terminating the session.
Tracking the Callee
SIP has a mechanism (similar to one in DNS) that finds the IP address of the terminal at which the callee is sitting. To perform this tracking, SIP uses the concept of registration. SIP defines some servers as registrars. At any moment a user is registered with at least one registrar server; this server knows the IP address of the callee.
The process is shown in the figure below.

H.323
H.323 is a standard designed by ITU to allow telephones on the public telephone network to talk to computers (called terminals in H.323) connected to the Internet.

A gateway connects the Internet to the telephone network. The gatekeeper server on the local area network plays the role of the registrar server, as we discussed in the SIP protocol.

H.323 uses G.711 or G.723.1 for compression. It uses a protocol named H.245, which allows the parties to negotiate the compression method. Protocol Q.931 is used for establishing and terminating connections. Another protocol, called H.225 or RAS (Registration/Admission/Status), is used for registration with the gatekeeper.


Reference:
1. Behrouz A. Forouzan, “Multimedia,” in TCP/IP Protocol Suite, 3rd ed. New Delhi, India: Tata McGraw-Hill Publishing Company Limited, 2007.

Similar Documents

Free Essay

Multimedia

...Computer-delivered electronic system that allows the user to control, combine, and manipulate different types of media, such as text, audio, video, graphic and animation. The most common multimedia machine consists of a personal computer with a sound card, modem, digital speaker unit, and CD-ROM. Long time as the future revolution in computing, multimedia applications were, until the mid-90s, uncommon due to the expensive hardware required. With increases in performance and decreases in price, however, multimedia is now commonplace. Nearly all PCs are capable of displaying video, though the resolution available depends on the power of the computer's video adapter and CPU. One of the multimedia elements is audio. Audio include music, speech ,or any other sound. Audio and digital music are different from other multimedia elements. This is because the audio and digital music is not related to visual .Audio cannot be seen but can only be heard with the ears only. Computer will result in vibration of air through the speakers or sound box, while our ears detect vibrations in the air. As the sound wave moves away from the vibrating object, the individual molecules do not advance with the wave, they vibrate at what is termed their average resting place until their motion still or they are set in motion are the component that make up a sound wave, frequency, amplitude, velocity, wavelenght, and phrase. Before the continuous wave analog audio can be stored in a computer...

Words: 608 - Pages: 3

Free Essay

Multimedia

...Multimedia refers to any application that combined several elements of multimedia such as text, graphics, animation, audio and video. A combination of text, audio, animation, video, images/graphics, and interactivity define multimedia. Each appeals to a human sense. The degree and manner to which each is used has the potential to increase learning. The mix of video, audio, and high quality graphics afforded by multimedia may grab the attention of students, particularly poor readers, in ways that traditional approaches to instruction would not (Kenworthy, 1993). In multimedia environments, learners construct meaningful knowledge by "selecting words and selecting images from the presented material, organizing words and organizing images into coherent mental representations, and integrating the resulting verbal and visual representations with one another" (Mayer, 1997, p. 4). Sound The sound is used to provide emphasis or highlight the transition from one page to another. Sound synchronization screen, so that teachers can submit a lot of information. Creative use of sounds to become imaginative stimulus; improper use, it has become an obstacle or an annoyance. The distance between the two properties amount - the acoustic frequency in each peak height of these wave patterns - (sometimes referred to as pitch) peaks. The greater the distance, low voice. Representatives, through video visualization capabilities can be direct and powerful. While this is no doubt, it is how we see...

Words: 1906 - Pages: 8

Free Essay

Management with Multimedia

...Lesson 1-What is Multimedia? Overview Introduction to multimedia. Applications of multimedia. Virtual reality (VR). Introduction to Multimedia Multimedia is a combination of text, art, sound, animation, and video. Text Graphic/ Image Animation Video Audio/ Sound It is delivered to the user by electronic or digitally manipulated means. A multimedia project development requires creative, technical, organizational, and business skills. Introduction to Multimedia Multimedia becomes interactive multimedia when a user is given the option of controlling the elements. Interactive multimedia is called hypermedia when a user is provided a structure of linked elements for navigation. Multimedia developers develop multimedia projects. Introduction to Multimedia The software vehicle, the messages, and the content together constitute a multimedia project. A multimedia project shipped to end-users with or without instructions is called a multimedia title. A project can be Stand-alone or launched on the Web. Introduction to Multimedia Multimedia projects can be linear or nonlinear. Projects that are not interactive are called linear. Example: Power Point Slides, Watching a movie, listening to a story etc., Projects where users are given navigational control are called non-linear and user-interactive. Introduction to Multimedia Authoring tools are used to merge multimedia elements into a project. The software used for building multimedia content. Example:...

Words: 542 - Pages: 3

Free Essay

Multimedia and Graphics

...MULTIMEDIA & GRAPHICS BSc Computing & Business Information Systems INTRODUCTION Suppose you had created a new computer program to display images of geological microscope slides interactively. Your program would offer people without a microscope the chance to examine rock samples under different lighting, rotate them and take measurements How would you provide the people who were going to use your program with instructions about how to operate it? MULTIMEDIA AND GRAPHICS INTRODUCTION Information can be conveyed in the form of text, still images, Web pages, slideshow presentations, video, sound or interactive tooltips MULTIMEDIA AND GRAPHICS INTRODUCTION 1. 2. Write a tutorial and a reference manual Make use of screenshots and other illustrations MULTIMEDIA AND GRAPHICS INTRODUCTION 3. Create an instructive Web site 4. Prepare a PowerPoint slide presentation, video and use a tool tip MULTIMEDIA AND GRAPHICS MEDIA Different media – text, image, sound, video and so on Distinctive characteristics of media  Time-based  Static media MULTIMEDIA AND GRAPHICS MEDIA Time-based media  Exhibit some change over time  e.g., video, animation and sound  Presentation of media are usually supplied with player controls: start, stop and pause Static media  Do not exhibit any change over time  Still images and text MULTIMEDIA AND GRAPHICS WHAT IS MULTIMEDIA? Media may be combined into multimedia Multimedia is defined...

Words: 742 - Pages: 3

Free Essay

Multimedia Principle

...Basic Principles of Multimedia Design and Development. Detailing what it takes to do it right, this article describes some basic design principles, delineates the phases of development, demonstrates appropriate analysis and defines some basic multimedia terms. As the line between disk- and web-based projects continues to blur, it becomes easier to communicate effectively using either or both. Capture and hold attention with multimedia: combine audio, video and animated graphics with the written word to deliver interesting, entertaining and compelling messages. The best navigation is highly intuitive, so users find what they want; web pages often provide alternate paths (e.g., buttons, image maps, hypertext). Textual content should be designed for accessibility and readability; depending on the kind of project, text may also need to be printable and changeable. During design it is also important to consider technology issues such as bandwidth, throughput and system requirements. Here are the activities involved during each phase of multimedia development. Milestones: Analysis Report, Outline, Statement of Work. Design Phase: begin design (interface, navigation, graphics, text treatment); draft flowcharts and storyboards; review and make any necessary revisions to designs, etc.; define and build rapid prototype(s); conduct tests of rapid prototype(s) with typical end users. Design Interactive design is probably the most exciting feature...

Words: 1458 - Pages: 6

Free Essay

Stages of Multimedia Project

...List and briefly discuss the stages of a multimedia project. Be sure to define the milestones that mark the completion of each phase. Firstly, I must develop a sense of the project's scope and content, forming a rough idea in my head of how to design it. Then I have to develop an organized outline and a plan that is rational in terms of the skills, time, budget, tools, and resources that I have in hand. Next comes scheduling: once I have worked out my plan, including the phases, tasks and work items that I feel will be required, I have to lay out these elements along a timeline, which will usually include milestones at which certain deliverables are to be done. To create a schedule I must estimate the total time required for each task. After scheduling, I estimate the cost of the project; it is a relatively simple matter to estimate cost and effort in, for example, the manufacturing industry. Keeping progressive accounts and billing records makes it possible to review the financial part of the project, for example by preparing a cost sheet. Finally, I write and structure the elements of the multimedia project. Potential clients do not have a clue about how to make multimedia, but they do have a vision of their project. As a project designer, I will know what my clients want and how to satisfy them. I may occasionally encounter a more formal request for proposal. These are typically detailed documents from large corporation...

Words: 274 - Pages: 2

Free Essay

Multimedia Content

...Multimedia content is something that is presented through video, audio or images. It could be a video presentation or an image with sound in it. There are a lot of formats available for developing all sorts of multimedia content, and many of them are implemented on current websites. Most companies that advertise on the internet attach promotional videos or images of their products in order to display them to the customer in the best possible way. Multimedia elements are necessary to deliver the best user experience, which can often be the selling point. A multimedia element can be a still image or an animation, which is made of a series of still images; another multimedia component can be video footage captured with a camera. Interactivity in multimedia is when the media receives input from the user as a form of communication and then interprets that information to perform calculations that create an output to be displayed back to the user. This two-way communication between the user and the media content allows developers to create more advanced and user-friendly interfaces. A major development in multimedia technology was the release of laser disc technology, which enabled the user to access high-quality analogue images on the computer screen. The fall in hardware costs made these technologies more widely available, which led to massive expansion of the internet and video games industry, allowing it to develop rapidly and create competition for television...
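The input-interpret-output cycle that defines interactivity can be sketched as a minimal loop. The event names and the `process` function below are invented stand-ins for real media handling, not part of any essay or library:

```python
def process(event):
    # Stand-in for real media processing: interpret the user's input
    # and compute something to display back to the user.
    return f"rendered:{event}"

def interactive_session(events):
    """Two-way communication: each user input is interpreted and an
    output is produced for display back to the user."""
    return [process(e) for e in events]

print(interactive_session(["click", "drag", "keypress"]))
# ['rendered:click', 'rendered:drag', 'rendered:keypress']
```

A non-interactive (linear) presentation would simply emit outputs on a timer, with no `events` feeding back into the loop.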

Words: 716 - Pages: 3

Free Essay

Introduction to Multimedia Systems

...Introduction to Multimedia Systems. Editors: Gaurav Bhatnagar, Shikha Mehta, Sugata Mitra, Centre for Research in Cognitive Systems (CRCS), NIIT Ltd., New Delhi, India. ACADEMIC PRESS, A Harcourt Science and Technology Company, San Diego San Francisco New York Boston London Sydney Tokyo. Cover art: © 2001 John Foxx Images. This book is printed on acid-free paper. Copyright © 2002 by ACADEMIC PRESS. All Rights Reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Requests for permission to make copies of any part of the work should be mailed to: Permissions Department, Harcourt Inc., 6277 Sea Harbor Drive, Orlando, Florida 32887-6777. Explicit permission from Academic Press is not required to reproduce a maximum of two figures or tables from an Academic Press chapter in another scientific or research publication provided that the material has not been credited to another source and that full credit to the Academic Press chapter is given. Academic Press, A division of Harcourt, Inc., 525 B Street, Suite 1900, San Diego, California 92101-4495, USA, http://www.academicpress.com. Academic Press, Harcourt Place, 32 Jamestown Road, London NW1 7BY, UK, http://www.academicpress...

Words: 60427 - Pages: 242

Premium Essay

Aspects of Interactivity, Hypertext and Hypermedia, and Social Media: Facebook, Friendster, Blogs and Second Orality, Construction of Reality and Multimedia and Ideology Synopsis Paper

...Aspects of Interactivity, Hypertext and Hypermedia, and Social Media: Facebook, Friendster, Blogs and Second Orality, Construction of Reality and Multimedia and Ideology Synopsis Paper by: 2000-54307 ARADA, Blancaflor P. The Media Construction of Reality As I read this article, I remembered the Manila hostage crisis of August 23, 2010. Because Philippine media wanted to report factual details and information, the news networks became too involved with the hostage crisis. According to the Wikipedia entry, “TV5 news anchor Erwin Tulfo remained in permanent contact with Mendoza, while superintendent Orlando Yebra and chief inspector Romeo Salvador led the negotiations. By this time, several major television channels in Manila and Hong Kong had replaced their programmes with non-stop live coverage of the hostage situation, and live footage became available worldwide. The news networks were allowed to film police activity, and as the bus was equipped with a television, the gunman was able to watch and find out what the police were doing, and was even able to find the locations of snipers” (Wikipedia, 2013). What made Mendoza snap was seeing his brother and son taken away by the police on the live television coverage. This resulted in a failed hostage rescue attempt in which eight (8) hostages were killed. It may be the Philippine National Police’s fault, or Mayor Alfredo Lim’s, or Vice-Mayor Isko Moreno’s, or even President Benigno “Noynoy” Aquino III’s, but there is no doubt that...

Words: 3881 - Pages: 16

Premium Essay

Ethical Challenges Surrounding the Movie and Music Industries

...and software that you haven't paid for. Not paying people for their creative work isn't just an ethical issue; it's illegal. Under U.S. copyright law, offenders can be punished by up to five years in prison and $250,000 in fines. We need to educate everyone on these issues and make sure we download authentic software from authorized sources to avoid the pitfalls of piracy. We also need to lead by example with our kids today by letting them know that cheap and easy downloads are not always better; they are usually illegal and can be harmful to your computer. How has education affected your views about the practice of downloading or copying multimedia content in these fields? Education has affected my view about the practice of downloading or copying multimedia content in these fields because downloading or copying multimedia content for free can cost the music or movie industry millions of dollars, and for the people who work in or around the industry, such as movie rental businesses and movie theaters, it could cost jobs or put companies out of business. Theoretically it has a negative effect on the whole economy. Illegal Downloads: When Sharing Becomes Stealing by Caroline...

Words: 296 - Pages: 2

Free Essay

Multimedia

...Multimedia, or mixed-media, systems offer presentations that integrate effects existing in a variety of formats, including text, graphics, animation, audio, and video. Such presentations first became commercially available in very primitive form in the early 1980s, as a result of advances that had been made in digital compression technology, particularly the difficult area of image compression. Multimedia online services are obtainable through telephone/computer or television links; multimedia hardware and software exist for personal computers, networks, the internet, and interactive kiosks; and multimedia presentations are available on CD-ROMs and various other media. The use of multimedia in our society most definitely has its benefits and its drawbacks. Some of the more computer-related uses of multimedia, such as electronic publishing, the internet, and computers in education, will be discussed in depth throughout this paper. Electronic publishing is the publishing of material in a computer-accessible medium, such as on a CD-ROM or on the Internet. In a broader sense the term could also include paper products published with the aid of a desktop publishing program, or any form of printing that involves the use of a computer. Reference works became available in the mid-1980s both in CD-ROM format and online. Increasingly, in the 1990s, magazines, journals, books, and newspapers have become available in an electronic format, and some are ...

Words: 1938 - Pages: 8

Free Essay

Multimedia

...Nanotechnology and its process on computing: Nano and technology: * A nanometre is a unit of length in the metric system, equal to one billionth of a metre (10^-9). * Technology is the making, usage and knowledge of tools, techniques and machines in order to solve a problem or to perform a specific function. “Nanotechnology is the art and science of manipulating matter at the nanoscale.” Nanotechnology in computing: Computing includes designing, developing and building hardware and software systems; processing, structuring, and managing various kinds of information; doing scientific research on and with computers; making computer systems behave intelligently; and creating and using communications and entertainment media. Nanocomputing: “A nanocomputer is a computer whose physical dimensions are microscopic. The field of nanocomputing is part of the emerging field of nanotechnology. Several types of nanocomputers have been suggested or proposed by researchers and futurists.” Nanocomputing, as defined in this report, refers to computing systems which are constructed from nanoscale components. The issues that need to be faced for successful realization of nanocomputers relate to the scale and integration of the components. Nanotechnology and its types: Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. Most engineers agree that technology has not yet come close to pushing this limit. By 1970s standards, today's ordinary microprocessors...

Words: 2254 - Pages: 10

Premium Essay

Nt1310 Unit 1 Assignment 1

...3GP is a required file format for video and associated speech/audio media types and timed text in the Multimedia Messaging Service (MMS). This file format used to be very popular on older mobile phones; the quality wasn't that good, but the files didn't take up much space. Streaming Streaming allows the user to consume multimedia without actually having to download the file: you can listen to a song or watch a video without having to save it as a file on your device or computer. Streaming also depends on the user's bandwidth: either the connection will be able to cope with the video, or, if it is unable to keep up, the result will be buffering and possibly the video/song not loading at all. Progressive download Progressive download delivers a file through HTTP and saves a temporary copy to the computer so that the user can watch the piece of media while it is still downloading. Once it is downloaded, the user won't need to download the file again...
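The idea behind progressive playback, consuming the media chunk by chunk as it arrives instead of waiting for the whole file, can be sketched as follows. This is a simulation using an in-memory "file"; the chunk size and the player hand-off are illustrative, not a real media API:

```python
import io

def progressive_play(stream, chunk_size=4096):
    """Read a media stream chunk by chunk, handing each chunk to a
    (hypothetical) player as soon as it arrives, rather than waiting
    for the entire file. Returns the number of chunks played."""
    chunks = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:      # end of stream reached
            break
        chunks += 1        # a real player would decode/render the chunk here
    return chunks

# Simulate a 10,000-byte media file arriving over HTTP.
media = io.BytesIO(b"\x00" * 10_000)
print(progressive_play(media))  # plays in 3 chunks (4096 + 4096 + 1808 bytes)
```

With true streaming, no permanent copy of the chunks is kept; with progressive download, the chunks accumulate into a temporary local file while playback proceeds.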

Words: 3437 - Pages: 14

Premium Essay

My Cv

...Rana Maher AbdAllah Fikry 6th of October City, Giza, Egypt Mob.: (+20) 01272317644 Date of birth: 6/8/1984 Email: rana84maher@gmail.com Nationality: Egyptian Religion: Muslim Marital Status: Single EDUCATION AND QUALIFICATIONS 2015 - Current Studying EMBA - Nile University 2001 - 2005 Faculty of Arts - Mansoura University, Bachelor of Arts - English Department, Degree: Good 1999 - 2001 Zahraa Islamic Language Schools CERTIFICATES June 2015 Service Management in Microsoft Dynamics CRM 2013 June 2015 Customer Service in Microsoft Dynamics CRM 2013 Feb 2014 Business Systems Analyst - Reqmaster, IIBA Cairo Chapter May 2015 ITIL Foundation Feb 2014 Project Management Preparation (PMP) Aug 2013 Mini MBA - Management Principles, Native Egyptians November 2005 CompTIA A+ Certification (Hardware & Operating System Maintenance), September 2005 to November 2005, a cooperation venture between MCIT (Ministry of Communications and Information Technology) and IBM (International Business Machines). WORK EXPERIENCE 2010 - Current Telecom Egypt Mar. 2013 - Current Head of Technical and Systems Support Division; Business Analyst in Business Process Automation and Invoicing Applications Dept. (Smart Village, Egypt), where my tasks are: Define and document business processes and software requirements. Eliciting requirements by discovering the underlying...

Words: 843 - Pages: 4

Premium Essay

Social Networking

...Design of Internet-Based News Delivery Systems and Its Impact on Society. Yuri Quintana, Graduate School of Library and Information Science, University of Western Ontario, London, Ontario, Canada, N6G 1H1. Tel. +1 519 679-2111 ext. 8500, Fax +1 519 661-3506, http://www.newmedia.slis.uwo.ca/yuri/ Abstract This paper presents an overview of emerging interactive multimedia technologies and how they can be used to deliver news on the Internet. A set of design principles for designing interactive multimedia news systems is given that includes factors such as the effective use of navigational aids, design of menus, presentation styles, and effective use of media. Examples of effective designs and implementations of multimedia news on the Internet are also given. The impact and benefits of multimedia news on society are also discussed with examples. The paper concludes with some possible designs for future news delivery systems. 1.0 Introduction The news industry is currently undergoing major transformations as a result both of the growing popularity of the Internet itself and of advances in interactive multimedia technologies for the Internet. The types of news sources available on the Internet include newspapers, news wires, cable television, news magazines, and radio stations. New technologies for the Internet include animations, direct manipulation of graphical interfaces, and real-time on-demand audio and video. The shift from paper to electronic delivery of news occurred almost simultaneously...

Words: 5557 - Pages: 23