Free AIOU Solved Assignment Code 8622 Spring 2021


Course: Non-Broadcast Media (8622)
Semester: Spring, 2021
ASSIGNMENT No. 1


Q.1 Explain the nature and scope of non-broadcast media. Support your answer with examples.

The present-day world is facing two general problems: the “information explosion” and the “population explosion”. Information explosion means an explosion of knowledge. Today, throughout the world, social and technological changes are taking place rapidly because the world of information is expanding. New frontiers of knowledge open day by day, and the horizon of human knowledge and understanding is expanding very fast.

On the other hand, alongside the explosion of knowledge there is also a population explosion. The student population is increasing immensely year by year due to the growth of population and the democratisation of education, with varying levels of motivation and aspiration. The problem of “population explosion” is more serious in developing countries than in developed ones. India is facing serious difficulties from both the population and the information explosion.

So these two general factors, the “information explosion” and the “population explosion”, have posed critical problems for education: more things to be learnt and more people to be taught. Today there is a cry for “more education for more people in less time”. To solve these problems successfully, educational technology consisting of various media of mass communication is essentially required. Both qualitative improvement and quantitative expansion of education can be facilitated and accelerated with the help of these mass media under educational technology. So the mass media has come to our rescue in tackling this problem.

Thomas Edison was one of the first to produce films for the classroom. Many colleges and universities experimented with educational film production before World War I, and training films were used extensively during the war to educate a diverse and often illiterate population of soldiers in a range of topics from fighting technique to personal hygiene. Improvements in filmmaking, in particular the ability to produce “talkies,” were put to use just before and during World War II for technical training and propaganda purposes. While the most artistically acclaimed propaganda production may have been Triumph of the Will (1935), one of a series of films made by Leni Riefenstahl during the 1930s for the German Nazi government, similar films were produced by all the major belligerents. In the United States the army commissioned Hollywood film director Frank Capra to produce seven films, the widely acclaimed series Why We Fight (1942–45), in order to educate American soldiers on what was at stake.

Instructional television courses began to be developed in the 1950s, first at the University of Iowa. By the 1970s community colleges all across the United States had created courses for broadcast on local television stations. Various experiments in computer-based education also began in the 1950s, such as programmed or computer-assisted instruction, in which computers are used to present learning materials consisting of text, audio, and video and to evaluate students’ progress. Much of the early research was conducted at IBM, where the latest theories in cognitive science were incorporated in the application of educational technology. The next major advancement in educational technology came with the linking of computers through the Internet, which enabled the development of modern distance learning.

Broadcast media is what its name implies: media that is cast broadly and freely for anyone to access live. Radio and free-to-air TV are probably the most common examples.
Print media is all media that is printed in hard copy, such as on paper. The most common examples are newspapers and magazines. Mass media is media of whatever type, broadcast, print or digital, that reaches large numbers of people; the category includes media such as pay TV and Internet-based news services, as well as the traditional media.

Advantages and Disadvantages:

  • The advantages are that it moves very quickly and has no delay, so you would not have to wait hours for the content you are waiting for.
  • Any programs or music will always be recent and ready to use.
  • It is also very trendy and informational for fashion audiences, TV channels and even contact details.
  • Its disadvantages would be:
  • It could be illegal, meaning many users would not be allowed to use it, or it may require a licence (around £140).
  • Here are examples of some broadcast media.

YouTube is, I would say, the most popular broadcast medium in many ways, because it carries a great deal of information:

  • you can find music on it
  • watch television
  • and also surf on it

but they all have their own way of being published.

Non-Broadcast Media

Non-broadcast audio is completely the opposite of broadcast (obviously!). The main power that non-broadcast material has is that it enables you to pause, stop, rewind and fast-forward video and/or audio as and when you feel like it, or if there is a need to. As non-broadcast media is not live or a continuous stream, it can also be played at any time of any day. Some examples are videos on YouTube, which you are able to view freely and leisurely as and when you please. The same goes for BBC iPlayer or your music on iTunes. Non-broadcast media can also be playing a CD, listening to your iPod or MP3 player, or even watching a DVD on your TV. All of these examples are linked with each other because they can all come to a halt at any time and are not broadcast as a continuous stream.

Advantages & Disadvantages

  • The advantages are that it needs no TV licence.
  • You can access it when you want.
  • It is also potentially better quality (more than one colour, not black and white).
  • It is more accessible.
  • You are also in control, which is important and even more exciting because you can do many things with it.

The difference between broadcast and non-broadcast media is that one uses satellites and transmission masts to deliver media that is controlled by certain people (not everyone) and offers only a few features to the public, whereas the other may need an internet connection, for iPlayer and similar services, and we are able to locate and access the material and pause, rewind, fast-forward or even stop it.

AIOU Solved Assignment Code 8622 Spring 2021

Q.2 Critically examine the steps involved in the choice of media.

Distance education is often called online learning because Internet-connected
computers are the primary delivery vehicles that bring together teacher and learner.
This connection implies the replacement of face-to-face instruction that has existed
since the beginning of time. It is understandable then that distance education often
mimics face-to-face learning. The availability of contemporary technological tools
creates opportunities for teachers to engage learners without directly facing them and, at the same time, to enhance the process.

Print

One of the major distinctions in the history of distance learning has been its medium of delivery. Some of the early programs were delivered primarily in print and are often referred to as correspondence courses. Correspondence study was conducted largely through the mail. The instructional media were books and other printed materials. The papers that passed from teacher to learner and vice versa provided the interaction.

Today, the most common medium for learning at a distance is still paper (books, study guides, and bibliographies), though it may not be as glamorous as some of the colorful computer-based graphic resources.

Radio and Telephone

Another “old-timer” is radio. There are many examples of using radio for teaching and learning. Radio is a synchronous medium; that is, all learners have to be listening at the same time even though they are in different locations. Later, radio learning was enhanced by telephone conference calls during or after the initial audio presentation. Instruction by both telephone and radio usually incorporated printed materials as part of the delivery system.

Audiotapes and Television

Still later, disc recordings and recorded tapes offered an extension of radio and
telephone communication. With the advent of audiotape, radio programs could be
recorded and sent to learners, who could then choose the time and place to listen and respond to the materials presented in the audiotapes. When broadcast television became available, complete courses were offered (often at early morning hours) with supplemental materials, such as printed texts and audiotapes.

Each new delivery vehicle often absorbed support media from previous systems. Each communication vehicle was the framework that permitted interaction between teacher and student, thus validating each approach as a delivery system. These approaches retained the feeling and experience of most traditional face-to-face classes. Other variations, such as complete courses on audiotape or videotape, followed and incorporated some of the earlier media and interactive procedures between distance teachers and learners. Closed circuit television offered still another approach. Lessons were offered simultaneously to students in remote locations, such as a university campus or individual school buildings in a school system.

Computer-based

Current distance learning programs are increasingly relying on computer technologies but still use traditional media as resources for effective learning. These media are
relatively inexpensive and can reach many individuals who prefer to study whenever
and wherever they wish. The downside of one-way media use, with the exception of the telephone, is that interaction is limited and feedback is often delayed because of slow postal systems that deliver both study materials and responses to learner papers.

Nevertheless, these media are often part of the delivery system package even as
computer-based distance education continues to grow.

The computer has changed the traditional offerings of distance education. The term “online learning” creates a new orientation for teachers and learners. It retains many of the characteristics of earlier forms of distance learning while offering more sophisticated media resources as integral components of the learning process. Even though many of the resources are the same in content, they have become an essential part of computer-delivered programs. It is possible to provide charts, graphs, maps, slides, moving images, and audio recordings with the study guide that helps to organize and deliver an entire course. It is the interaction between instructor and student in distance settings that requires communication on a person-to-person basis. This interaction uses e-mail, telephone conference calls and “chat” functions in computer programs to substitute for the face-to-face experiences of earlier times. The original distance education by correspondence has been upgraded by twenty-first-century technology.

The primary question stemming from these new developments is: “Do students learn as well at a distance using contemporary technologies as they do when attending a face-to-face class?” Many studies regarding this question have been conducted and most research findings show that there is no significant difference between learning at a distance and face-to-face classroom learning. This finding applies to all age groups in almost every setting (Simonson, Smaldino, Albright and Zvacek, 2003; Gunawardena and McIsaac, 2003). If these findings are true, even most of the time, what are the implications for selecting media for teaching at a distance?

SELECTION OF APPROPRIATE MEDIA

The process of selecting media for learning at a distance is, in most cases, the same (or nearly the same) as media selected for face-to-face teaching and learning. Delivery of media online offers easy access for students who are located at home, in a place of work or using computer access points in schools and libraries. Selecting media for distance education begins with consideration of course (or unit) objectives as a starting point. If learning can be facilitated by seeing, hearing or using manipulative media, which medium or media should be used to achieve the objectives and how will it be delivered? Can it be integrated with an online course management system (such as Blackboard, click2learn or WebCT) or should it be separate for use in conjunction with printed handouts and online guidance? Some distance courses provide kits of media that are used off-line. Examples are science laboratory kits, audio lectures, and packets of manipulative materials.

Each medium should pass certain tests before incorporating it into the distance learning scheme. Will the learner have access to the medium at home, work or in a community setting? Does the access include the necessary software? Can the cost of the material be justified, that is, is it cost effective for the instructor to produce and for the students to acquire? Is the resource essential or just “nice to have”? Again, think about cost to the student and the extent to which it will enhance achievement of the learning objectives.

Is there an alternative medium that could achieve the same objective? (For example,
sometimes printed materials instead of audiovisual media will suffice.) Will it be
delivered as an integral part of a course management system or as a separate item?
(For example, do students have separate means to use CD-ROM, floppy disc,
videotape, audiotape, slides or manipulative materials?) Will interaction be handled by e-mail, online discussion groups, telephone (individual or conference calls), infrequent face-to-face meetings, or postal correspondence?
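As an illustration only, the screening questions above can be sketched as a simple checklist filter. The criterion names and the sample media below are hypothetical, not drawn from any standard selection model:

```python
# Hypothetical sketch of the media-screening tests described above.
# Each candidate medium is checked against the questions the text lists:
# learner access, software availability, cost justification, and necessity.

def passes_screening(medium):
    """Return True only if a candidate medium passes every screening test."""
    tests = [
        medium["learners_have_access"],  # available at home, work, or community?
        medium["software_available"],    # necessary software included?
        medium["cost_justified"],        # cost-effective to produce and acquire?
        medium["essential"],             # essential, not just "nice to have"?
    ]
    return all(tests)

candidates = [
    {"name": "audio lecture", "learners_have_access": True,
     "software_available": True, "cost_justified": True, "essential": True},
    {"name": "interactive simulation", "learners_have_access": True,
     "software_available": False, "cost_justified": True, "essential": False},
]

selected = [m["name"] for m in candidates if passes_screening(m)]
print(selected)  # only media passing every test survive
```

In this sketch a medium is dropped as soon as any single test fails, which mirrors the text's advice that each medium "should pass certain tests" before being incorporated.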

One emerging trend is the hybrid approach to teaching and learning at a distance.
There are many opportunities for creating hybrid distance education. One of the most common designs is online with both face-to-face and telephone conference calls within a single course. Hybrid courses do not change the decisions about media to be used, but they do require new instructional designs. Hybrid distance education usually facilitates interactivity among students and between the instructor and students. One example is a hybrid course that is offered primarily online, with students meeting face-to-face at the beginning of the course (or at mid-point) and the instructor initiating a conference call with groups every other week. This approach helps to replicate the social element of traditional courses and student conversations before and after classes and in the coffee shop. When the social factor is included using some of these techniques, there are fewer concerns about student isolation, which can otherwise be a fairly frequent complaint of distance education courses.

AIOU Solved Assignment 1 Code 8622 Spring 2021

Q.3 Explain the significance of instructional media in education.

Instructional materials are the content or information conveyed within a course. These include the lectures, readings, textbooks, multimedia components, and other resources in a course. These materials can be used in both face-to-face and online classrooms; however, some must be modified or redesigned to be effective for the online environment. The best instructional materials are aligned with all other elements in the course, including the learning objectives, assessments, and activities. Instructional materials provide the core information that students will experience, learn, and apply during a course. They hold the power to either engage or demotivate students. This is especially true for online courses, which rely on a thoughtful and complete collection of instructional materials that students will access, explore, absorb, and reference as they proceed in a course.

Therefore, such materials must be carefully planned, selected, organized, refined, and used in a course for the maximum effect. The planning and selection of instructional materials should take into consideration both the breadth and depth of content so that student learning is optimized.

Common Instructional Content Types, with Examples and Resources/Tips:

Print Materials: Readings, Syllabus, Lesson/Assignment Files, Rubrics, Handouts
  • Examples: Assignment; Rubrics; Discussion Prompt
  • Resources/Tips: Create accessible course materials; Develop instructions using the online activity worksheet

Digital Media/Recorded Lectures (Audio or Video): Movies/TV Clips/YouTube, Podcasts, Screencasts, TEDx Talks, etc.
  • Examples: Plant Pathology 123: The Irish Potato Famine (courtesy of Professor Aurelie Rakotondrafara; produced with PowerPoint and Articulate Storyline); Screencasts: Statistical Programming Experience (courtesy of Professor John Gillett; produced with CaptureSpace Lite); Narrated PowerPoint using Camtasia (courtesy of instructor Lisa Lenertz-Lindemer, Environment, Health, & Safety); Motivation YouTube Video; TED Talks; Podcasts
  • Resources/Tips: Screencast information, resources, and more examples; How to use CaptureSpace Lite to upload video, create a screencast, and record audio

Course Introduction Video
  • Examples: Course Introduction Video (courtesy of Professor Dietram A. Scheufele); Good and Bad Examples of Course Welcome
  • Resources/Tips: Tips to create an introduction video

Presentation Materials: Lecture Notes, PowerPoint, Prezi, Adobe Captivate
  • Examples: Prezi Example: John Hawks – Intro to Anthropology Course (tip: use the arrows to navigate forward and backward); Adobe Captivate Examples: Activity: Match the skill to the correct level of Bloom’s Taxonomy; Activity: How do you define assessment?
  • Resources/Tips: How to create effective eLearning presentations; PowerPoint for E-Learning

Expert Interviews, Guest Speaker Recordings
  • Examples: Video: John Hawks – Tour of Gibraltar caves to explore Neandertal behavior, for the Human Evolution: Past and Future MOOC; Audio: Kris Olds – Interview with Nigel Thrift, for the Globalizing Higher Education and Research MOOC
  • Resources/Tips: Pedagogical Roles for Video in Online Learning; Develop Your Video Presence; Tips for instructional design for videos

Case Studies/Scenarios
  • Examples: Articulate Storyline example courtesy of the Physical Therapy Department, produced by the DoIT Academic Technology Online Course Production Team
  • Resources/Tips: UW-Madison content authoring pilot technologies; Writing case studies

Educational Games
  • Examples: Civics; Games for Change
  • Resources/Tips: 7 things you should know about games and learning

Simulations
  • Examples: Diffusion Simulation Game; Tax Simulation
  • Resources/Tips: Uses, trends & implications for simulation technologies in education

Visualizations: Illustrative Pictures, Graphics, Interactive Data
  • Examples: Word Clouds; Infographic
  • Resources/Tips: Tips for using word clouds in eLearning (for reflection and synthesis; to enhance critical thinking); Tips for using infographics

Third Party Tools and Software
  • Examples: Diigo Example; Diigo Outliner Example; PowToon Example
  • Resources/Tips: Over 100 third-party tools and services for assessment, content, collaboration, and interaction, including Diigo, Diigo Outliner, and PowToon

Role Playing
  • Examples: Thiagi’s Training Games
  • Resources/Tips: 5 ways to use role-playing; Role playing ideas and resources; Role playing assignment

Student-Created Content
  • For the most part, any of the other content types can also be created by students as an assignment and then used as examples in your course.
  • Resources/Tips: Tips for adding student-generated content

Expert Blogs
  • Examples: The Rapid E-Learning Blog is a great resource for building learning
  • Resources/Tips: 7 things you should know about blogs

Open Educational Resources (OER): Textbooks, Online Articles, Audio or Video Clips, Links to Online Resources, Databases, Examples, Simulations
  • Examples: OERs to explore
  • Resources/Tips: Integrating OERs in teaching and learning

Websites/Really Simple Syndication (RSS) Feeds
  • Examples: EDUCAUSE®, a nonprofit association committed to advancing higher education
  • Resources/Tips: 7 things you should know about RSS; Placing RSS feeds into D2L using a widget (includes examples); How to add an RSS feed into Moodle; How to add an RSS feed to a Canvas announcement

Software & Topical Training
  • Examples: Lynda.com, an online training library of video tutorials that is available free to UW-Madison staff and students

AIOU Solved Assignment 2 Code 8622 Spring 2021

Q.4 Critically examine the role of record player in distance education.

A video cassette recorder (VCR) is an electromechanical device which records and plays back analog audio/video data recorded natively from broadcast television or from other sources onto a removable magnetic cassette tape. It revolutionized the movie and television industry by allowing people to watch TV shows and movies on their own schedules. The VCR can record a TV broadcast to be played back at another time, making it very convenient for a working person to watch shows later, a practice known as time shifting.

The video cassette recorder evolved with the history of videotape recording in general, as it is not actually tied to a specific videotape format such as VHS or Betamax. The world’s first commercially successful VCR was introduced by Ampex as the Ampex VRX-1000 in 1956, which made use of two-inch tape and the Quadruplex videotape professional broadcast standard format. The first home VCR was called the Telcan and was produced in 1963 by the UK Nottingham Electronic Valve Company for £60, which today is roughly equivalent to $1,500.

The VCR started gaining mass market success in 1975 because of the emergence of the VHS and Betamax formats, which gave the common consumer more affordable access to magnetic videotape media. It was also due to the fact that six major firms were actively developing VCRs, namely JVC, Ampex, RCA, Matsushita/Panasonic, Toshiba and Sony. The competition meant that prices went down quickly, and by the end of the 80s well over half of homes in the US and Britain had a VCR.

Even with the new technologies emerging such as the Laserdisc and Video CD in the 90s, VCRs still thrived commercially. It was not until the introduction of the Digital Video Disc or DVD that VCR popularity began to decline. DVD was the first universally successful optical medium for playback and pre-recorded videos. As it gained popularity, pricey DVD recorders and other digital video recorders dropped in price, which made VCR sales decline further.

A VCR (videocassette recorder) is an electromechanical device for recording and playing back full-motion audio-visual programming on cassettes containing magnetic tape. Most videocassettes have tape measuring 1/2 inch (1.27 cm) in width. The most common application of the VCR is its use by consumers for playing and recording television (TV) programs and for creating home video recordings. A TV camera equipped with a VCR is called a camcorder. The abbreviation VCR can also stand for videocassette recording.

The first VCRs were designed and built in the 1960s and became available to the public around 1970. The technology rapidly evolved and the equipment came down in price, so by the mid-1970s it was within the reach of the average consumer. Today there are two major types of VCR technology in use, known as VHS (Video Home System) and Betamax. Both types were developed in Japan, VHS by the Japan Victor Company (JVC) and Betamax by Sony. VHS systems are far more popular among home TV viewers. Betamax equipment is still used by some professional production engineers, many of whom believe that Betamax offers better image quality. The Betamax tape takes a more direct path through the recording and playback apparatus than a VHS tape, so recording and playback operations are faster and more convenient with Betamax than with VHS. But less wear occurs on a VHS tape, so VHS cassettes last longer. Also, VHS cassettes have more capacity (in terms of recording time) than Betamax cassettes.

In the late 1980s and early 1990s, the VHS and Betamax formats became competitive. For complex legal reasons, VHS captured the home video recording and reproduction market. By 1993, Betamax was essentially obsolete among consumers in the United States. In recent years, the use of video tape has become less common because of the widespread availability and popularity of DVD technology.

The Helical Scan System

In an audio cassette deck, which only registers audio signals, the tape passes over a static recording/playback head at constant speed. The higher the speed of the tape, the more tape particles pass the head opening and the higher the frequencies that can be registered. Thanks to the extremely narrow head opening, it is possible to record and play back the entire tone range, up to 18,000 or 20,000 Hz, despite a slow tape speed of no more than 4.75 centimeters per second.

However, to register video signals, a range of 3.2 MHz is required and so a tape speed of approximately 5 meters per second is a prerequisite. This is over 100 times as fast as the tape speed for an audio cassette deck. The required high recording speed for video recorders is realized by the helical scan system without such high tape speeds. The system basically consists of a revolving head drum, that has a minimum of two video heads.

The head drum has a diameter of approximately 5 cm and rotates at a speed of 1,500 revolutions per minute. The 1/2-inch (12.7 mm) wide videotape is guided around half the surface of this drum in a slightly oblique manner, which is achieved by positioning the head drum at a slight angle. The tape guidance mechanism then ensures that the tape is guided through the device at a speed of approximately 2 cm per second (less than half of the already low tape speed used in audio cassette decks).

Tape guidance along the head drum with the video heads writing tracks on the tape.

In the meantime, the rapidly revolving video heads write narrow tape tracks of no more than 0.020 to 0.050 mm wide on the tape, next to each other, diagonally. Every half revolution, each of the two heads writes one diagonal track, which equals half an image. The first head writes one track, i.e., the first field (the odd-numbered scanning lines). The second head writes a second track, i.e., the other half of the image (the second field: the even-numbered scanning lines), which precisely fits into the first. This corresponds to the interlacing principle, as applied in television (see Chapter 2: TV set). One full revolution of both heads results in two diagonal tracks right next to each other, together forming one entire image scan (a frame).

This means that two apparently contradictory requirements can be realized simultaneously: a low tape speed of only 2 cm per second and, at the same time, a high registration speed (relative tape speed) of no less than 5 meters per second. These two requirements make it possible to record the high video frequencies up to 3.2 MHz. At the same time, the low tape speed gives a time capacity of up to three hours.
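The arithmetic behind these figures can be checked directly from the numbers the text quotes (a roughly 5 cm drum spinning at 1,500 rpm, with two heads). The short calculation below is an illustrative sketch only:

```python
import math

# Figures quoted in the text (approximate)
drum_diameter_m = 0.05   # head drum diameter: ~5 cm
drum_rpm = 1500          # drum rotation speed

revs_per_second = drum_rpm / 60                        # 25 rev/s
head_speed = revs_per_second * math.pi * drum_diameter_m

# Two heads: each half revolution writes one field (half an image),
# so one full revolution yields two fields = one complete frame.
fields_per_second = 2 * revs_per_second                # 50 fields/s
frames_per_second = revs_per_second                    # 25 frames/s

print(f"writing speed: {head_speed:.2f} m/s")
print(f"{fields_per_second:.0f} fields/s, {frames_per_second:.0f} frames/s")
```

With the rounded 5 cm diameter the head-to-tape writing speed comes out near 4 m/s, the same order as the ~5 m/s the text cites (the quoted speed corresponds to a slightly larger effective drum diameter). The 25 frames and 50 fields per second match the interlacing rates of PAL television.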

Azimuth Settings

Compared with early video recorders, modern video recorders have their video tracks lying right next to each other. To avoid interference, the two video heads are angled slightly away from each other. As a result, the video head openings that transmit the magnetic tracks to the tape create an angle between them: the heads are angled 15 degrees in opposite directions, making a total angle of 30 degrees. This diverted registration angle ensures that no problems are caused if the heads slightly lose track during playback and touch the next track. The heads only register tape information at an angle that precisely corresponds to the position of the head opening. This system is called the azimuth recording system. If the video heads stray too far from the track, which could lead to distorted images, tracking control can correct this.

Azimuth settings. The head openings are cut with different azimuth angles, so that the tracks can be written next to each other.

Synchronization Track

The rotational speed of the head drum and the video heads needs to be kept constant within strict parameters. Moreover, the tracks must be scanned during playback in precisely the same way as they were recorded. Each tape track is synchronized at the recording stage by means of field synchronization pulses. These pulses are generated in the video recorder by a separate head and are recorded on a separate narrow track at the side of the videotape. This is called the synchronization, servo or control track.

Position of the video, audio and synchronization tracks on the tape.

Position of the audio, sync and erase heads inside the VCR.

Video Systems

There are three major video systems in use today:

  • Video Home System (VHS)
  • Betamax
  • Video Hi8

When video recorders were first introduced, Philips also developed a system called V2000. Despite the fact that it was a high-quality system, it was not successful in the market. Although Betamax was reasonably successful at first, its popularity waned and VHS was adopted as the world standard.

Betamax
The Sony Betamax System, launched in 1975, was based on the pre-existing professional Sony U-matic-system. In the Betamax system, the video tape is guided along the head drum in a U-shape for all tape guidance functions, such as recording, playback and fast forward/backward. When the cassette is inserted, the tape is guided around the head drum (called threading). Threading the tape takes a few seconds, but once the tape is threaded, shifting from one tape function to another can be achieved rapidly and smoothly.

The Betamax U-system before (top) and after (bottom) threading.

VHS
JVC’s VHS system was introduced one year after the launch of Betamax. In VHS, the tape is guided through in an M-shape, the so-called M-tape guidance system, which is considered simpler and more compact than the U-system. Threading itself is fast, but it is repeated every time the tape guidance function is changed, which makes the M-system somewhat slower and noisier than the U-system. This problem is solved by “Quick-start” VHS video recorders, which allow fast and silent changes in tape guidance functions. To avoid excessive wear, M-tape guidance system recorders are provided with an automatic switch-off feature, activated some minutes after the recorder is put on hold, which automatically unthreads the tape. An improvement of the basic VHS system is HQ (High Quality) VHS.

The VHS system used different starting points than Betamax, such as track size and relative speed. VHS has rather wide video tracks but a slightly lower relative tape speed, and the same applies to the audio track. In general, the advantages of one aspect are tempered by the disadvantages of another. The end result is that there is not much difference between the sound and image quality of the two systems.

The VHS M-system before (top) and after (bottom) threading.

Video Hi8
As a direct addition to the Video-8 camcorders, there is a third system: Video Hi8, which uses a smaller cassette than VHS and Betamax. The sound recording takes place digitally, making its sound quality very good. When using the special Hi8 Metal Tape, the quality of both image and sound is equivalent to that of Super-VHS. The Video Hi8 recorder can also be used to make audio recordings (digital stereo) only. Using a 90-minute cassette, one can record 6 x 90 minutes, making a total of 9 hours of continuous music. The Video Hi8 system also allows manipulation of digital images, such as picture-in-picture and editing. Video Hi8 uses a combination of the M- and U-tape guidance systems.

Cassette sizes compared.

Sound Recording

Mono

In the case of a mono video recorder, the audio signal that corresponds with the image is transferred to a separate, fixed audio head. As in an audio cassette deck, this head writes an audio track along the length of the tape. This is called linear or longitudinal track recording.

The video recorder has two erase heads. One is a wide erase head covering the whole tape width, which automatically erases all existing image, synchronization, and sound information when a new recording is made. The other erase head is smaller and positioned over the audio track. With this erase head, the soundtrack can be erased separately, without affecting the video information. In this way, separate audio can be added to a video recording. This is called audio dubbing, which can be particularly useful when making your own camera recordings.

The linear audio track does have some restrictions. Due to its low tape speed, it is not suitable for hi-fi recordings. Moreover, the audio track is so narrow (0.7 mm for VHS and 1.04 mm for Betamax) that not even stereo sound can be recorded properly. The frequency range is limited, as is the dynamic range (the span, measured in decibels, between the quietest and loudest recordable sounds), and the signal-to-noise ratio is not very high. (The signal-to-noise ratio compares the strength of the wanted signal to that of the noise. The higher this ratio, the less noise and the better the signal will be.) The sound quality of the mono track can be improved by a noise reduction system. There is a way to get superior hi-fi stereo sound quality on a videotape (used in hi-fi video recorders), which will be discussed later.
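The decibel figures used throughout this discussion follow the standard formula for amplitude ratios. A minimal sketch (the example values are illustrative, not measurements from any particular recorder):

```python
import math

def snr_db(signal_rms, noise_rms):
    # Signal-to-noise ratio in decibels from RMS amplitudes.
    # For amplitude quantities, dB = 20 * log10(ratio);
    # for power quantities it would be 10 * log10(ratio).
    return 20 * math.log10(signal_rms / noise_rms)

# A signal whose amplitude is 100x the noise floor has an SNR of 40 dB:
print(snr_db(1.0, 0.01))   # → 40.0
# An 80 dB dynamic range corresponds to a 10,000 : 1 amplitude ratio:
print(snr_db(10_000, 1))   # → 80.0
```

The same formula explains why each extra 20 dB of dynamic range means a tenfold larger amplitude span.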

Hi-fi Stereo Sound

Hi-fi video recorders were developed for improved sound quality. The most common image-quality standard on these recorders is HQ (the recorder is labeled ‘VHS High Quality Hi-fi Stereo’). Conventional mono video recorders use linear audio registration, which does not allow hi-fi recordings, so a special method was devised to record stereo sound with hi-fi quality.

In the case of hi-fi, the audio signal is also put on tape via revolving heads, similar to the video signal, rather than on the linear track. Because the video tracks lie right next to each other with no space in between, the audio tracks must be recorded in the same place as the video tracks. This is achieved by recording the audio signal under (deeper than) the video signal.

Hi-fi video recording, where the audio signal is recorded at a deeper level, after which the video signal is recorded on top.

In hi-fi video recorders, the audio signal is modulated onto a high carrier frequency. This is realized via FM modulation, with the right stereo channel at a slightly higher carrier frequency than the left. The corresponding video and audio signals are written to tape immediately after each other. First the FM audio signal is registered at a deep level in the tape’s magnetic coating. Straight after the audio signal, the video signal is recorded. As the frequency of the video signal is higher than that of the audio signal, it does not penetrate as deeply into the tape coating. In the top layer, the video signal erases the audio signal and is recorded in its place. Thus, the audio and video tracks are written in the same magnetic layer, separately, one on top of the other. The entire magnetic coating is only 0.004 mm thick. To ensure that the two do not interfere, the audio and video tracks are written on tape at different angles, by means of different heads with different azimuth settings.
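The FM scheme described above can be sketched numerically. The snippet below modulates a test tone onto two separate carriers, one per stereo channel, and confirms that each channel's energy sits in its own frequency band; the carrier and deviation values are illustrative assumptions, not the exact VHS specification:

```python
import numpy as np

fs = 20_000_000                           # sample rate in Hz, well above both carriers
t = np.arange(0, 0.001, 1 / fs)           # 1 ms of signal
audio = np.sin(2 * np.pi * 1_000 * t)     # 1 kHz test tone

def fm_modulate(signal, carrier_hz, deviation_hz, fs):
    # FM: the carrier phase advances faster or slower with the audio
    # amplitude; deviation_hz sets the maximum frequency swing.
    phase = (2 * np.pi * carrier_hz * np.arange(len(signal)) / fs
             + 2 * np.pi * deviation_hz * np.cumsum(signal) / fs)
    return np.cos(phase)

left = fm_modulate(audio, 1_400_000, 50_000, fs)   # illustrative carriers
right = fm_modulate(audio, 1_800_000, 50_000, fs)

# The left channel's spectrum peaks near its own carrier, clear of the right's:
spectrum = np.abs(np.fft.rfft(left))
freqs = np.fft.rfftfreq(len(left), 1 / fs)
peak = freqs[np.argmax(spectrum)]
print(round(peak))   # close to 1_400_000
```

Because the two carriers are well separated, a band-pass filter on playback can recover each channel independently, which is the point of giving the right channel a slightly higher carrier frequency.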

To guarantee compatibility with cassettes not recorded in stereo hi-fi, the fixed audio recording/playback heads remain in place. So, a hi-fi video recorder always has two audio registration systems installed. This offers possibilities for amateur video makers to do audio dubbing using an audio mixer to combine the sound of the hi-fi track with other sounds and to write the mix to the linear audio track. In this way synchronized recordings will be left intact.

Quality Audio Recorders

A hi-fi video recorder is also suitable as a high-quality audio recorder, not only because of the professional recording quality, but also because of the long play possibilities and the low recording costs.

The specifications of hi-fi video sound registration systems equal those of professional tape decks and compact discs. The entire sound spectrum can be covered without any problems, and the dynamic range is 80 dB, close to the 90 dB that compact discs can cover. (As the video recorder is a recording medium, a couple of good microphones can actually cover the whole 80 dB range.) Recordings made on a hi-fi video recorder result in almost unmeasurable wow and flutter and very little harmonic distortion. The bass reproduction of a hi-fi video recorder is remarkably good compared to tape recordings made on cassette decks. A disadvantage is that sound editing is not possible on a VCR. Instead, the required tape segments can be copied onto another tape with hardly any loss of sound quality.

A hi-fi video recorder needs to be tuned very accurately. As the two rotating audio heads function alternately, the recorded sound consists of successive segments that need to fit together perfectly. If they do not, the result is rumble, a low humming sound. In high-quality, well-tuned hi-fi video recorders you will not hear this sound.

Super-VHS

Super-VHS or S-VHS (for Betamax: ED-Beta) is a major step forward in the field of video registration. It is a recording-playback system of such high quality that its recordings equal the quality of direct TV broadcast signals. S-VHS offers better image quality than normal VHS: fuller colors, more sharpness, clearer color separations and color fields, and elimination of moiré effects. Details not visible on normal VHS become visible on S-VHS, such as fine fabric patterns and eyelashes. As in all video recording systems, recording image and sound on magnetic tape involves the actual image, the colors, the horizontal and vertical synchronization pulses for perfect image building, and finally the sound. A full television signal carries so much information that it would take a frequency band of 7 million Hz (7 MHz) to store it all. As this is too much, S-VHS reduces the 7 MHz bandwidth to 5 MHz, without seriously reducing the image quality.

The frequency ranges of sound, TV and VCR. The original 7 MHz are reduced to 5 and 3.2 MHz. S-VHS can register the full TV bandwidth. The Y and C signals are put separately on tape and separately transferred to the TV when played back.

Signal Separation
However perfectly the helical scan system works, normal VHS video recorders cannot register the entire 5 MHz range that comes through via a television broadcast. The bandwidth is reduced to 3.2 MHz at the expense of quality, meaning reduced sharpness, detail, and clarity of color transitions and more noise. Taking away almost 2 MHz is not a matter of simply filtering the signal, as that would lead to the loss of essential information. The bandwidth is reduced by separating the interwoven Y and C signals and putting them on tape separately. When played back, both components are re-mixed to one signal and then transferred to the television set, together with the sound and synchronization signals. In S-VHS the reduced bandwidth is brought back to its original full 5 MHz. In order to achieve this, new video heads and a superior kind of tape were developed, with higher recording density and a smoother tape surface, for optimal head-tape contact.

Resolution
Due to the increased bandwidth and the increased dynamic range of the brightness (Y) signal, the resolution of an S-VHS recording is higher than that of VHS. Resolution relates to the number of distinguishable adjacent vertical picture lines. Because these vertical lines are counted along the horizontal direction, this is also called horizontal resolution. Increased horizontal resolution means more detail is visible, resulting in a brighter image, clearer picture lines and smoother image fields. S-VHS has a resolution of 400 picture lines, compared to 240 picture lines in VHS, and 300 in the conventional TV signal. Moreover, a sub-emphasis circuit suppresses image noise, particularly for weak video signals, which also contributes to better image quality.
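The quoted line counts follow directly from the recorded luminance bandwidth. A common rule of thumb is that horizontal resolution (in TV lines per picture height) is roughly 2 × bandwidth × active line time ÷ aspect ratio; the sketch below assumes the standard 52 µs active line time and 4:3 aspect ratio of conventional analog TV, which are not stated in the text above:

```python
def horizontal_resolution(bandwidth_hz, active_line_s=52e-6, aspect=4 / 3):
    # Each full cycle of the luminance signal can render two alternating
    # vertical lines; dividing by the aspect ratio expresses the count
    # per picture height ("TV lines"), the usual way resolution is quoted.
    return 2 * bandwidth_hz * active_line_s / aspect

print(round(horizontal_resolution(3.2e6)))  # VHS, 3.2 MHz: ≈ 250 lines
print(round(horizontal_resolution(5.0e6)))  # S-VHS, 5 MHz: ≈ 390 lines
```

The results land close to the 240-line (VHS) and 400-line (S-VHS) figures quoted above, which shows why widening the recorded bandwidth is what raises the resolution.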

In S-VHS the brightness and color signals (Y and C) are not combined in the usual manner when played back, and are not sent to the TV as a composite signal, but are transmitted separately. This separated transmission takes place via a special cable and a connection socket: the S-connection (S = Separated Y/C). The result is that the cross color between these signals is largely decreased, which has a positive impact on the color separation.


Q.5 Discuss the importance of slides documentary in distance education.

Slideshows that combine text with appealing visuals can be a nice alternative to text readings. They’re also much quicker and easier to create than videos, and viewing them requires a lot less internet bandwidth.

These kinds of slideshows, which students would click through on their own, should ideally be designed with that independent experience in mind. Unlike a presentation that you give live, these would not be simply a support for a speaker; they’d need to stand completely alone, with all the necessary text on the slides, rather than in the speaker notes.

Rather than sharing these as a file, which would require you to adjust editing permissions and would likely result in students viewing it in editing mode, you can share it in presentation mode so the slides take up the full screen and feel more like an experience.

Content can also be delivered via audio, which can be a nice change of pace for students who are used to text or videos only, and because audio files are much smaller than video files, they’re less likely to strain home internet capabilities.

  • Listenwise houses a fantastic library of curated, standards-aligned podcasts.
  • You can find lots of other podcasts on pretty much any topic imaginable. Here are some good ones that are produced specifically for student audiences.
  • Recording your own audio instructions, reflections, or lectures is simple and quick. You can use a voice recording app on a smartphone or a web-based tool like Online Voice Recorder to create MP3 files, which are just audio files, then upload those files right into whatever platform you’re using to distribute information to students. In many cases, students will be able to play them right on that platform.

Below I’ve listed some good options for creative student products. Before you jump on these, be mindful that the end products still need to align with your instructional goals and should not become what some call Grecian Urn assignments: creative-looking projects that don’t ultimately have much instructional value.

  • Book: Students create a children’s book, mini-textbook, handbook, comic, or other kind of book. These can be done on paper or created with apps like Book Creator.
  • Google Tour: Using Google Tour Builder, students can create customized tours that combine photos, text, and targeted locations on Google Earth. These could be used to create tours that explore current events, historical periods or phenomena, science or geography topics, global research topics, students’ personal histories or future plans, or completely fictionalized stories that take place in various locations around the world.
  • Infographic: On paper or using a tool like Piktochart, have students create an infographic to represent or teach about an idea or set of data.
  • Lesson: Have students write their own lesson on a chunk of your content. Provide them with the basic structure of a lesson to follow, including objectives, direct instruction, guided practice, and some sort of assessment to measure their success.
  • Model: Students can create a physical model representing some aspect of your curriculum, then photograph it from various angles or create a video tour of the model with their own narration.
  • Museum or Multimedia Collection: Have students curate a collection of artifacts representing a curricular concept, along with their own written captions, in a Google Slides presentation. Two fantastic resources for gathering these artifacts are the Smithsonian Learning Lab and the Google Arts & Culture website.
  • Podcast: Have students use the recording tools mentioned in the “Audio” section above or an app like Anchor to record a podcast where they express an opinion, tell a story, or teach about a content-related topic. If students have a lot of material, they can break their podcast into multiple episodes and do a series instead.
  • Scavenger Hunt: Have students participate in a content-based scavenger hunt and take photos to record their findings. An app like GooseChase can make this even more fun.
  • Sketchnote: Have students create a sketchnote to represent a content-related topic using paper or with a drawing app like Sketchpad.
  • Video: Students can create their own videos as creative, informative, persuasive, or reflective pieces. These can be public service announcements, commercials, mini-documentaries, instructional videos, short feature films or animations, or TED-style talks. Tools for creating these can range from quick response platforms like Flipgrid, to screencasting tools listed in the previous section, a tool that creates stop motion videos like Stop Motion Studio, or simple online video creators like Adobe Spark.
  • Website: Using tools like Weebly, Wix, or Google Sites, have students develop a website to document a long-term project or teach about a particular idea.
