Byte, March '86

We continue our project of reading Byte magazine forty years later. As always, you'll find the rest of the entries in the series under the Byte tag on obm. In January we talked about the Atari 520ST, and this month it's the turn of its big brother, the 1040ST.

Cover of Byte magazine, illustrated with an overhead photo of a computer, with its monitor (a CRT, of course) and its two-button mouse. There is a second cover story, Homebound Computing.

First, I'd like to highlight a mega-ad from Microsoft promoting its programming languages.

I'm only including the first page, because the ad runs on for seven more; Redmond's catalog was extensive and its advertising budget deep. The ad also serves to show which languages Microsoft considered important at the time. They open with C ("First with the pros"), Macro Assembler ("The Quickest. Bar none"), FORTRAN ("The overwhelming favorite"), COBOL ("The interactive edge"), Pascal ("When you've outgrown the others") sharing a page with QuickBASIC ("BASIC just got faster"), which, if I'm not mistaken, they illustrated with an HP12, and they close with LISP ("The language of artificial intelligence"), muMATH ("Mainframe math on your PC") and Sort ("Versatility without compromise"), also sharing a page.

Who expected to see FORTRAN and COBOL up there in 1986? Not me. I didn't even remember that Microsoft had ever had a symbolic algebra system…

And from the language ads we jump to the program that, in a reality parallel to ours, occupies Photoshop's place: the legendary Deluxe Paint (here, a love letter to Deluxe Paint III), which at the time couldn't make Adobe look pale in comparison (see what I did there?)… because Photoshop 1.0 wouldn't reach the market until the far-off year of 1990.

Drawing and Painting Program for Amiga

Electronic Arts has released a graphics package for drawing and painting with the Amiga. Deluxe Paint, first in a series of arts software for the Commodore machine, has 20 drawing tools, 7 painting modes, 14 special-effects tools for brushes, 10 built-in brush shapes, and a palette of 32 colors (out of a possible 4096).

Deluxe Paint's drawing tools include magnify and zoom functions that let you split the screen into a normal image and a magnified portion of the image. As you zero in on and alter details in the magnified window, changes are reflected in the normal window. Another tool lets you customize paintbrushes. Anything you can draw can be framed, picked up, and used as a new paintbrush.

The package offers four types of brushes: circles, squares, dots, and airbrush. You can rotate any brush 360 degrees, flip it vertically or horizontally, stretch it into new shapes, or shear its angles. Shading and smearing capabilities help with texture and nuance.

You can create animation effects with what Electronic Arts calls "color cycling"—cycling a variety of colors through a static picture to concoct the illusion of motion. You can use three different color cycles and speeds per picture.

Five color controls let you handle the mix of red, green, and blue and adjust the hue and brightness of each color. The software enables the Amiga to automatically generate the shades of color between any two pigments you pick.
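The two effects described above, automatic shade generation between two colors and palette-based color cycling, boil down to simple palette arithmetic. Here is a minimal Python sketch (hypothetical helper names; the article doesn't describe Electronic Arts' actual implementation):

```python
def blend(c1, c2, steps):
    """Linearly interpolate between two RGB colors, endpoints included,
    the way a paint program generates intermediate shades."""
    return [
        tuple(round(a + (b - a) * i / (steps - 1)) for a, b in zip(c1, c2))
        for i in range(steps)
    ]

def cycle(palette, start, end):
    """One step of color cycling: rotate the palette entries in
    [start, end] by one slot. Pixels keep their palette indices, so a
    static image appears to move as the palette rotates."""
    seg = palette[start:end + 1]
    palette[start:end + 1] = seg[-1:] + seg[:-1]

# Five shades from black to a dim gray (small channel values for clarity;
# the Amiga's 4096-color palette uses 4 bits per channel).
shades = blend((0, 0, 0), (4, 4, 4), 5)
```

Repeatedly calling `cycle` on a timer, with up to three independent `(start, end)` ranges, gives the per-picture cycles the article mentions.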

Deluxe Paint, priced at $79.95, is designed to work with two other programs still in the Electronic Arts workshop, Deluxe Print and Deluxe Video Construction Kit (reportedly slated for April release). It requires 256K bytes of RAM and Kickstart 1.1.

Let me pause for a moment in the books section to remember the (back then) legendary Peter Norton. If you had a PC running MS-DOS, you most likely remember his famous Norton Utilities (which still exist!), but you may not have known that he was also a best-selling author on PC programming.

BOOK REVIEWS

THE PETER NORTON PROGRAMMER'S GUIDE TO THE IBM PC
Peter Norton
Microsoft Press, Bellevue, WA: 1985
426 pages, $19.95

THE COMPUTER CULTURE
Denis P. Donnelly, editor
Fairleigh Dickinson University Press, Cranbury, NJ: 1985
176 pages, $24.50

MICROSOFT MACINATIONS
Mitchell Waite, Robert Lafore, Ira Lansing
Microsoft Press, Bellevue, WA: 1985
497 pages, $19.95

THE COMPUTER LAW ANNUAL 1985
Miles R. Gilburne, Ronald L. Johnston, Allen R. Grogan, editors
Harcourt Brace Jovanovich, New York: 1985
405 pages, $60

THE PETER NORTON PROGRAMMER'S GUIDE TO THE IBM PC

Reviewed by Donald Evan Crabb

Although the information Peter Norton provides in The Peter Norton Programmer's Guide to the IBM PC is not new or unique, reading it is an education. The book picks up where Norton's Inside the IBM PC (Robert J. Brady Co., 1983) left off. Whereas the earlier book concentrates on the hardware components of the IBM Personal Computer and how they work together, the new book is written strictly with the PC programmer in mind. Both works combined provide a comprehensive technical reference to the PC. As you might expect, these books cover some material in common. For example, both explore the ROM BIOS of the PC. But the discussion in The Programmer's Guide is designed for the programmer. In fact, this book should be useful to anyone who needs to understand the technical details involved in creating PC programs. Norton makes the distinction right from the start that he is providing more than just PC programming knowledge. He is trying to impart concepts about PC programming.

Norton also concerns himself with the philosophy of programming the PC. He laces the book with explanations about the design concepts that permeate the entire IBM PC line. Due to Norton's wealth of experience working with PCs, this information is synthesized so that it is more useful than the usual dry engineering discussion that you often get in books of this kind. He carefully divides the ROM information into four chapters: ROM BIOS basics, ROM BIOS video services, ROM BIOS disk services, and ROM BIOS keyboard services.

The Programmer's Guide details the original PC. But keep in mind the subtitle of this book: "The ultimate reference guide to the entire family of IBM personal computers." Norton explains differences between the design, construction, and systems software of the other members of the PC family and the PC. Most of the examples and information describe the Intel 8088 microprocessor and how it's programmed through the services provided by the ROM BIOS and by DOS. Many of the programming examples use BASIC as the representative high-level language. Pascal and C-language examples also appear. Norton shows how to write 8088 assembly-language interface programs for each of these languages.

The scope of the book extends to a number of programming areas. From video and disk basics, Norton moves from how the keyboard operates in programs to all the programming aspects of DOS. The final two chapters, "Program Building" and "Programming Languages," are worth the price of admission alone. Norton covers the conceptual basics of writing, compiling and interpreting, linking, and executing programs. Especially informative are the discussions of the DOS LINK program and the logical organization of assembly-language programs.

Norton discusses both the IBM Pascal compiler and the generic Microsoft Pascal compiler. He discusses Pascal data formats on the PC and how to work with them, as...

(If I had more time and money than I do, I would probably try to get hold of a copy of the Computer Law Annual 1985, because judging by the review it seems worth reading out of historical curiosity, with articles on the intellectual-property problems of reverse engineering and on antitrust regulation, although they also say it's written for lawyers.)

I won't dwell on the issue's star computer: by now it should be clear to you that the ST was better than the Mac (and let's not even talk about the Windows of the era)… but that it sat below the Amiga (with the exception of music applications: did you know Fatboy Slim still uses an ST?).

The Atari 1040ST

A megabyte of memory for $999

Editor's note: The following is a BYTE product preview. It is not a review. We provide an advance look at this product because we feel that it is significant. A complete review will follow in a subsequent issue.

Atari's new $999 1-megabyte 1040ST (see photo 1) establishes a price break reminiscent of the Commodore 64's. And, as table 1 shows, the 1040ST will be the first computer to begin its retail life at a price that represents less than one dollar per kilobyte. The 1040ST is clearly a bargain, with over 1 megabyte of RAM (random-access read/write memory), its operating system in ROM (read-only memory), an internal 720K-byte double-sided drive, an internal power supply, and the same features and functionality that already make the Atari 520ST an attractive purchase. (Editor's note: See "The Atari 520ST" by Jon R. Edwards, Phillip Robinson, and Brenda McLaughlin, January BYTE, page 84.)
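BYTE's "less than one dollar per kilobyte" claim is a quick back-of-the-envelope check:

```python
# $999 for 1 megabyte (1024K bytes) of RAM in the 1040ST.
price_per_kb = 999 / 1024
assert price_per_kb < 1.0  # about $0.98 per kilobyte
```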

System Description

Our coverage of the 520ST adequately describes most of the features of the 1040ST (see also the "In Brief" box on page 86). The new computer has the same keyboard, the same ports (although these are now in new locations, see photo 2), and the same architecture. We remain uncomfortable with the keyboard, but the keytops are removable. We suspect that some speedy entrepreneur will provide alternative tapered keys for the ST machines.

The most obvious changes are cosmetic: The keyboard/computer unit is 2 inches deeper and 4½ pounds heavier than the 520ST and the keyboard provides a much more substantial feel. The mouse/joystick ports are now located under the bottom right front of the unit, a significant improvement for left-handed users.

A number of changes are more than cosmetic. The internal power supply eliminates two of the external power supplies needed by the 520ST (wire haters rejoice). We left the unit on for five days and experienced no difficulties with overheating. There is no internal fan, but the unit appears to adequately dissipate heat. The internal disk drive supports both single and double-sided disks. An RF (radio frequency) modulator will allow you to hook up the 1040ST to a television set; you might, therefore, obtain the high-resolution monochrome system for word processing and programming without sacrificing the use of low- and medium-resolution color. However, we received a preproduction unit lacking the RF modulator that will accompany the final product; therefore, we were unable to test the television quality of the computer's output.

The megabyte of RAM in the 1040ST isn't crammed into the case. The 520ST uses a custom Memory Controller chip to handle its sixteen 256K-bit dynamic RAM chips. The 1040ST uses the same Memory Controller. Because the controller can handle 32 RAM chips at a time, the Atari engineers simply had to find room for 16 more 256K-bit dynamic RAMs on the 1040ST circuit board to pump RAM capacity to a full megabyte (see photo 3). In fact, the Memory Controller can also govern 1-megabit dynamic RAM chips. Atari should have little difficulty designing an ST with 4 megabytes of memory. Undoubtedly, the most interesting addition to this computer, apart from the extra memory, will be an empty socket for a graphics coprocessor. Our preproduction unit also did not include the socket, and it may not be offered with the first releases of the 1040ST. Phil Robinson discussed this and Atari's future plans with Shiraz Shivji, vice president of research and development for the company (see the text box "An Interview with Shiraz Shivji" on page 90).
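The chip-count arithmetic is easy to verify; a quick sketch, assuming 1-bit-wide DRAM parts (the standard organization of the era), with a hypothetical helper name:

```python
KBIT = 1024  # bits per kilobit

def bank_bytes(chips, kilobits_per_chip):
    """Total bytes provided by a bank of 1-bit-wide DRAM chips."""
    return chips * kilobits_per_chip * KBIT // 8

# 520ST: sixteen 256K-bit DRAMs -> 512K bytes.
assert bank_bytes(16, 256) == 512 * 1024
# 1040ST: thirty-two 256K-bit DRAMs -> a full megabyte.
assert bank_bytes(32, 256) == 1024 * 1024
# Hypothetical ST with 1-megabit parts -> 4 megabytes.
assert bank_bytes(32, 1024) == 4 * 1024 * 1024
```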

TOS IN ROM

With TOS (the operating system for both the 520ST and the 1040ST) in ROM, the 1040ST boots more quickly than the 520ST. [Editor's note: Atari is currently supplying the ROM chips to 520ST developers and will be making the chips available through users groups.] Booting with a nonsystem disk takes less than 6 seconds, down from 37...

The sharp-eyed among you will have noticed that the cover featured, besides the ST, a second theme: "homebound computing." What's that? The first article of the section tells us, opening with a reminder that telework wasn't invented by COVID and was already being discussed forty years ago, but announcing that the section will focus on computing as an aid for people who, for whatever reason (a disability, for example), cannot leave home.

WORKING AT HOME WITH COMPUTERS

by Jane Morrill Tazelaar

For some, telecommuting is a choice: for others, it is the only option

THERE ARE MANY REASONS for wanting to work at home. Some of them involve disabilities that make it impossible or impractical to work in an office. For many disabled persons, the alternative to working at home is no alternative at all; it is the only work they can do, the only way they can become productive members of the work force. Some people work at home because of priorities such as being available to young children, especially during the preschool years. Many parents of small children must choose between sending them to daycare centers or babysitters and earning a reasonable living. Other reasons involve the lack of local work opportunities in economically depressed or geographically remote areas. And for some people, working at home is a simple matter of choice. The electronic cottage, the flexiplace, telecommuting, worksteading, or whatever you wish to call it could be the answer to all these problems.

Starting Your Own Business

Starting your own business seems to be the answer for many people. There is a certain romance in the very word entrepreneur, and there is a great deal of information available to help you get started if the idea of being one appeals to you. The various organizations associated with the cottage industry movement— the Association of Electronic Cottagers, the National Association for the Cottage Industry, and the National Alliance of Homebased Businesswomen— offer good and helpful information for the aspiring entrepreneur. (See the text box "Sources Mentioned" on page 156 for addresses and phone numbers.) They also offer countless references, contacts, and, possibly most important, support groups.

Two subjects seemed to jump out at me from all the literature I have seen on starting your own business: selling and networking. First, if you don't want to sell, you probably don't want your own business. Most business ventures involve direct selling. Whether you provide a product or a service, you need to sell it. Even if you have salespeople working for you, you have to sell the idea to some financial institution to get funds to get started and to the people you hire to get them to work for you. Then, long after the basics are in place and you are operational, you'd better be prepared to talk about your product or service, explain its value, and convince a prospective buyer that he or she ought to do business with your company instead of someone else's— in other words, sell.

Networking is a way to increase your contacts. Whether people are business contacts or social contacts or both, they can add significantly to the success of your business. The people you know either personally or electronically are sources that can provide you with future customers, business partners, financiers, good tax accountants, awareness of your competition, discounts on computing equipment, the inside story on future technology, and so on.

Starting your own business is a very rewarding venture for some and a complete disaster for others. At the...

I'll skip the article on speech synthesis from images to jump straight to e-learning, a subject that strikes a particular chord with me.

THE ELECTRONIC UNIVERSITY NETWORK

by Donna Osgood

Get a degree without ever leaving your computer

UNTIL RECENTLY, education at home meant correspondence courses. Taking courses through the mail is a slow, cumbersome way to learn, and for many people it means missing out on a vital part of the education experience: contact with a human instructor. Without that, you can easily lose interest in the course and drop out.

Meanwhile, as the baby boom generation passes, colleges faced with declining enrollments are looking for ways to reach a wider range of potential students. They need to reach people who would not ordinarily be willing or able to matriculate in the traditional way.

TeleLearning's Electronic University Network addresses both problems. Through the Network, universities offer accredited courses to students who enroll, participate in "class," interact with instructors and other students, and take tests on the material they have studied, all without leaving their microcomputers. Since classes are small (usually 10 students per instructor) and feedback on each assignment comes within a day or so, students taking courses from colleges through the Electronic University Network get much more individual attention than they would in a large class on campus. Some of the other benefits of telecommunication apply here as well: An instructor can judge a student only on the basis of his or her work, without interference from preconceived notions and biases based on how the student looks, speaks, or acts.

Founded in 1983, TeleLearning began offering accredited courses in March of 1984. In January 1985 it established full-fledged degree programs, and it now offers two associate degrees, two bachelor's degrees, three MBAs, and specialized professional certificates. The degrees are awarded by fully accredited colleges (Thomas A. Edison State College in Trenton, New Jersey, City University in Bellevue, Washington, and John F. Kennedy University in Orinda, California). The Electronic University itself offers no credit, acting solely as a coordinating medium and resource center for students. About 17,000 students have enrolled.

Courses available through the Electronic University Network include noncredit courses for personal improvement (writing, computer literacy, drawing, and wine appreciation, for example), business and professional skill classes (time management, accounting, and business math, among others), and tutoring programs for children (reading, math, and computer literacy). Courses for credit span the humanities, natural sciences, mathematics, social sciences, and business at undergraduate and graduate levels.

How It Works

You enter the Electronic University by buying an enrollment package for $150. This one-time fee covers operating software, communications software, and lifetime enrollment in the Electronic University for your entire family. Tuition for individual classes is handled separately. The admissions questionnaire and class registration...

Because in 1986, the year when 2400-baud modems seemed the height of technology, distance education no longer had to mean correspondence courses, thanks to TeleLearning's Electronic University Network. There's more on the subject at Tedium.co and at eLearning Inside, but the article explains that this wasn't even "the latest thing": the company had been founded back in 1983 and had offered official degree programs since 1985, with 17,000 students enrolled by then. That said, the learning materials weren't online: they mailed the book to your home. Required technology: an IBM PC or PCjr, an Apple II, or (of course) a Commodore 64 (35% of enrollees didn't own a computer when they signed up). There was even instant messaging!

On to another topic that, surprise surprise, we're still talking about today: aging and technology.

INCREASING INDEPENDENCE FOR THE AGING

by K. G. Engelhardt and Roger Edwards

Robotic aids and smart technology can help us age with less dependency

FOR THE FIRST time in history, a significant portion of our population is living to be senior citizens, and we have no experience in caring for large numbers of healthy, literate, articulate older persons, many of whom are highly educated. As our society grays, we need more ways to help increase the independence of those with chronic and multiple disabilities. Rapid advances in microprocessor-based technologies are providing us with many new possibilities. Their miniaturization, flexibility, modularity, and ever-decreasing costs now make it possible to realistically address human problems that we could not just 10 years ago.

The need to control our environment and our lives in order to reduce dependence is critical to human development. Loss of personal independence is costly, not only in actual dollars spent on institutional and long-term care, but also in emotional and psychological terms. The need to reduce premature and unnecessary institutionalization of our elderly citizens is critical. We need more devices that will increase the independence and the sphere of control of individuals with disabilities, and we need to augment the care givers' tasks with state-of-the-art tools to help them provide better care.

This article discusses potential applications of microprocessor-based technology for increasing independence in those with declining abilities. From panic buttons to smart houses, from stationary telemanipulators to self-navigating robots, from memory-aid devices to expert systems for daily living, microprocessor-based technology can assist the functionally dependent older person.

Applications

An applications team was formed during the winter of 1984 to investigate potential uses for robots and robotic-related technologies. The team identified 54 subgroups of tasks and divided them into 12 major categories: patient transport-lift-transfer, housekeeping, ambulation (walking patients to help prevent bedsores), physical therapy, depuddler (urine cleaner), surveillance (to help with wandering patients), physician assistant, nurse assistant, patient assistant, vital-signs monitor, mental stimulation, and one miscellaneous group. Let's look at some possible robotic applications in a few of these groups.

Lifting and Transferring: The challenge of lifting and transferring individuals with partial or total paralysis, extensive weakness, or increased fragility due to age is significant. One robotic solution could be a track-mounted robot arm that glides along the ceiling until it reaches the room to which it has been summoned. The care giver or the older person could then direct the arm to assist in lifting or transferring the individual from bed to chair or wheelchair to bath, for example. This assistance could also help...

Yes, forty years ago we were already wondering whether digital technologies could help an ever-aging society. And next comes the topic we stop at every time on obm: computers and visual impairment.

COMPUTING FOR THE BLIND USER

by Aries Arditi and Arthur E. Gillman

Some special human factors must be considered in assembling a workable system

INEXPENSIVE COMPUTERS and nonvisual communications hardware have, in theory, made personal computing as accessible to blind as to sighted persons. But in practice, personal computing has its own special set of problems for the blind user. In this article we'll present some of the human-factors issues specific to nonvisual personal computing. Our concern is to make computers more accessible and efficient for blind and visually impaired persons. We hope our suggestions will be useful to individuals and to designers of hardware and software. Many of the improvements we discuss below can be implemented in several ways, often in more than one component of the system. They are intended to illustrate human-factors issues rather than to critique specific products.

The system we use as a basis for this discussion is a popular one for blind and visually impaired users and is inexpensive enough for home use as well as employment settings. It consists of an Apple IIe microcomputer operating under DOS 3.3, a Votrax Personal Speech System for voice output, and Raised Dot Computing's Braille-Edit program version 2.44a. [Editor's note: There is a more recent version of Braille-Edit with a number of new features and enhancements. See Henry Brugsch's review, "Braille-Edit", on page 251. Also, for an address list for manufacturers of products mentioned in this article, turn to page 208.] Most blind users have a printer for producing sighted (conventional) hard copy. Another useful peripheral is a braille printer, since braille hard copy is easier to proofread than voice output. While we will not specifically discuss braille hard copy, many of the human-factors issues discussed here are relevant to the design of braille printers.

Braille-Edit is an integrated software package designed to satisfy most blind users' needs to process documents. It is intended for use with a low-cost artificial-voice system such as the Votrax Personal Speech System or Street Electronics' Echo series (including the Echo+ speech synthesizer) and various other peripherals. Braille-Edit is not intended to (and does not) make all programs that run on the Apple accessible to the blind user, nor is it particularly useful in programming the computer. But it has a number of desirable utilities for the blind user, such as a translator of text to and from grade II braille (a commonly used coding system similar to Speedwriting shorthand) that makes impressively few errors and a copy facility for copying files to and from a paperless brailler such as the Versabraille from Telesensory Systems Inc. (TSI).

The hardware and software designed to make a system accessible to the blind user can be viewed as an...

The first paragraph could have been written today… Forty years later, the technologies are 10,000 times better, but the problems are either the same or we've introduced new barriers to make up for the ones we've (fortunately) torn down.

And with that section closed, we move on to the war among the computers with Motorola 68000 CPUs: the Mac, the ST and the Amiga (in strictly ascending order). Bruce Webster, one of the magazine's star writers, wrote a comparison that ran so long that only the first part fit in this issue. Its conclusion: the Mac was the most mature (it had been on the market much longer), the 520ST was a bargain, and the Amiga was weighed down by Commodore's problems but showed the most promise…

68000 Wars: Round 1

It is late November— almost Thanksgiving—as I finish writing this, and the editors back in Peterborough are screaming for my column. It's nice to be wanted, eh? However, if I don't wrap this up and upload it posthaste, I may be wanted in two or three states. Worse yet, a certain managing editor may end up being wanted for manslaughter, though I doubt if any reasonable jury would convict him. I must type faster...

I now have all three of the prominent 68000 machines: Apple's Macintosh, the Atari 520ST, and Commodore's Amiga 1000. I've had the Mac for nearly two years and the ST and Amiga for less than two months. And the single most common question I get these days is, "What do you think of the [one of the above] as compared to [one or two of the others above]?" Also, a lot of claims and counterclaims have been floating around concerning the relative merits of and problems with the three machines. In this column and the next few, I hope to sort out fact from fantasy and present some well-supported— if not completely objective— opinions. (Note: "Objective opinion" is an oxymoron, that is, a self-contradictory phrase, like "intelligent idiot" or "deliverable vaporware.")

Mac versus ST versus Amiga

The format of this column is simple. I'll take a number of different areas, one by one, and give my opinion on how the three machines stack up— who wins, who places, who shows. Where possible, this is based on direct experience. However, since I am not all-seeing or all-knowing, I have asked questions of those with more experience or knowledge, and I've done my best to acknowledge them at the end of the column.

Appearance and Physical Setup

The Macintosh wins this category easily. My 2-megabyte Mac has a 20-megabyte hard disk (MacBottom) and two disk drives (internal and external). The design is clean, attractive, professional, and unique. Better yet, the system takes up less than one square foot (9½ by 9½ inches) for the main unit and another 6 by 13 inches for the detachable keyboard, which can be easily moved 3 to 4 feet from the main unit. The Mac itself needs only one power outlet, though the hard disk requires its own as well. It definitely looks good in an executive suite and won't eat up all your desk space.

The Amiga comes in second, resembling the IBM PC (though, in my opinion, it looks nicer). The main unit is 17½ by 13 inches, covering more than twice as much desk space as the Mac. The monitor stacks nicely on top of the main unit, but a second (external) disk drive must sit to one side and takes up an 8- by 6-inch area. The detachable keyboard (6 by 16 inches) is larger than the Mac's, but it slides nicely under the main unit when not in use, and it can be moved almost as far as the Mac's. One warning, though: Since the expansion bus is on the right side of the Amiga, adding hardware is going to cause the Amiga to grow wider. The basic system (with two drives and monitor) needs two power outlets.

The ST comes in last, for reasons groused about last month. The main unit is 18½ by 9½ inches. It therefore takes up less space than the Amiga, but neither the monitor nor the external disk drives can stack on it, so a complete system takes up much more desk space than either the Mac or the Amiga: A conservative estimate is about 21 by 21 inches. The keyboard is built into the main unit, so you don't have the additional space requirements for that, but you also don't have the flexibility of a detachable keyboard. The ST has a nice design but looks much like a home computer (which it is). Most unfortunate are the thick cables and external power supplies— one for the main unit and one for each disk drive. A basic system with monitor and two disk...

And in our regular section, topics you wouldn't find in a computer magazine today, not even by accident… Diophantine equations!

Diophantine Equations

A man buys some x's at $154 each and some y's at $69 each. If he spends a total of $5000, how many of each did he buy?

Although this problem appears to be from a first-year algebra text, we find that the techniques required are not usually found in a "mainstream" course in mathematics. The equation 154x + 69y = 5000 has infinitely many solutions. However, assuming the man bought whole-number quantities, we want integral solutions (x, y) for the equation, and now we need a method for solving such equations.

Diophantine Equations

Equations of the form ax + by = c, for integral a, b, and c and integral solutions (x, y), are called Diophantine equations. No one is certain when or where Diophantus of Alexandria was born. Sources vary from "born about A.D. 50" to "flourished about A.D. 250." He is called "the father of algebra," having promoted algebraic notation and algebraic treatment of mathematical problems. Previously, such work was done by "rhetorical algebra" or geometric proofs.

A variety of methods are available for solving Diophantine equations. One of these is modulo arithmetic, a powerful and fascinating concept that I may explore more closely in a future column.

A very simple method of solving our original problem comes to mind. Since the equation is equivalent to y = (5000 − 154x)/69, we can simply try consecutive values of x (from 1 to 32 only) until we get an integral value for y.
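That trial of consecutive x values is a short loop; a sketch of the brute-force search the column describes:

```python
# Try x = 1..32 (154 * 33 would already exceed $5000) and keep every x
# that makes y = (5000 - 154x)/69 a positive integer.
solutions = []
for x in range(1, 33):
    rem = 5000 - 154 * x
    if rem > 0 and rem % 69 == 0:
        solutions.append((x, rem // 69))
# The only solution in range: x = 2, y = 68 (154*2 + 69*68 = 5000).
```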

Since we are mathematically inclined, such an inelegant approach may not sit well with us. Rather, we may prefer to look for a method of solution based on general principles of mathematics. What can we say in general about integral solutions for an equation of the form ax + by = c?

First, we can readily see under what conditions the equation would have no solution. Consider the greatest common divisor (GCD) of a and b. We will call it d. If d is not a factor of c, the equation will have no integral solutions. Why? Since a/d is, by hypothesis, an integer and b/d is also one, the value (a/d)x + (b/d)y will be an integer if x and y are integers. That is, the integers are closed under addition and multiplication. Thus, if c/d is not an integer, either x or y must not be an integer.

Diophantus Meets Euclid

This leads us to Euclid's algorithm, which was the subject of my last column in January (page 397). If we employ Euclid's algorithm to determine the GCD of a and b, we can immediately determine whether there are integer solutions to the Diophantine equation by dividing the GCD into c. But we can use Euclid's algorithm for much more than that. To see how, let us reexamine the algorithm with an eye toward solving Diophantine equations. Figure 1 outlines the way the Euclidean algorithm finds the GCD of 154 and 69. Their GCD is 1, meaning that the two numbers are relatively prime. Now, to begin our examination of the way to solve Diophantine equations, let's modify our original equation to 154x' + 69y' = 1. That is, we will begin with the case where c is equal to the GCD.

In figure 2, I have rewritten the divisions of figure 1 as equations. In order to find integer values of x' and y' that solve the equation 154x' + 69y' = 1, all I need to do is substitute 154 - 2(69) for 16 in equations 2 and 3 and 69 - 4(154 - 2(69)) for 5 in equation 3. After collecting terms, I find that 1 = 13(154) - 29(69). Thus, x' = 13, y' = -29 will satisfy the equation 154x' + 69y' = 1. We will call (13,-29) the basic solution to 154x' + 69y' = 1. Is it the only solution?

Let us write our equation in the general form again: ax + by = c. Now, let n be any integer and d be the GCD of a and b. If we add to the left-hand side of the equation, we haven't changed it...
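(Another aside: the back-substitution the column walks through is exactly the extended Euclidean algorithm. A minimal sketch, reproducing the basic solution (13, -29) and then scaling and shifting it to the positive answer of the original problem:)

```python
def extended_gcd(a, b):
    """Return (g, x, y) such that a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

g, x, y = extended_gcd(154, 69)
print(g, x, y)  # 1 13 -29, i.e. 13(154) - 29(69) = 1

# Scale by c/d = 5000 to solve 154x + 69y = 5000, then use the general
# solution (x0 + (b/d)t, y0 - (a/d)t) to shift onto nonnegative values.
x0, y0 = x * 5000, y * 5000
t = y0 // 154                     # largest t keeping y nonnegative
print(x0 + 69 * t, y0 - 154 * t)  # 2 68 (x also lands positive here)
```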

(And, so you can't say I'm overdoing the topic, I've skipped an article that opens with "Windows can be implemented on almost any system with a memory-mapped display"…)

I'll stop at the BIX section (you'll remember it: the printed digest Byte made of the conversations on its online service) to witness the birth of IFF, the file metaformat introduced by Electronic Arts that ought to be the foundation of how we work today, and which allowed multiple kinds of information (text, graphics, and audio, for starters) to be wrapped up in a single file.

IFF Graphics Protocol

amiga/softw.devlpmt #157, from gregr [Gregg Riker, Electronic Arts]

TITLE: IFF (Information Format Files) Is Available!

I mentioned that I used IFF files with the SlideShow. Allow me to elaborate.

Electronic Arts has a general interest in promoting standards, so we knocked heads with some people at Commodore-Amiga and came up with IFF.

IFF is intended to be used by any and all interested developers. It offers a convenient way of allowing programs to exchange data with one another.

For example, Graphicraft will be able to exchange files with Deluxe Paint and other EA products. The design is extensible, in that you may add your own types to the standard. There are programs available in C (public domain!) that will read and write graphic images in IFF format.

If you're interested in a copy of the spec, please contact Rob Peck at Commodore-Amiga. He can supply you with a copy. If you have any problems or need more information, please contact Jerry Morrison at Electronic Arts, (415) 571-7171.

P.S.: IFF covers graphics, audio, and text and is expandable!
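The structure that made IFF so extensible is disarmingly simple: a file is a sequence of chunks, each a 4-byte ASCII identifier, a 32-bit big-endian length, and the data, padded to an even length. A minimal reader along those lines, as a sketch only: real IFF additionally wraps everything in a FORM container chunk, which I'm skipping here, and the chunk IDs below are made up for the example.

```python
import io
import struct

def iff_chunks(stream):
    """Yield (chunk_id, data) pairs from an IFF-style stream: a 4-byte
    ASCII ID, a 32-bit big-endian length, the data, and a pad byte
    whenever the length is odd (chunks are word-aligned)."""
    while True:
        header = stream.read(8)
        if len(header) < 8:
            return
        chunk_id, size = struct.unpack(">4sI", header)
        data = stream.read(size)
        if size % 2:
            stream.read(1)        # skip the pad byte
        yield chunk_id.decode("ascii"), data

# A toy two-chunk "file": a 5-byte text chunk (odd length, so padded)
# followed by a 4-byte chunk of fake audio samples.
raw = (b"TEXT" + struct.pack(">I", 5) + b"hello" + b"\x00"
       + b"AUDI" + struct.pack(">I", 4) + b"\x01\x02\x03\x04")
for cid, data in iff_chunks(io.BytesIO(raw)):
    print(cid, data)
```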

And I'll close with a curiosity. By now you should be as in love with Byte's illustrations as I am… and this was so obviously true that even back then the magazine sold limited editions of its covers:

Ad for two limited-edition reproductions of the magazine's covers. One shows a robotic hand drawing a human hand next to a human hand drawing a robotic one; the other is an illustration of a 5¼-inch floppy disk.

And that's it for this month's Byte. If you want to do your homework for next month, as always, here are the Byte magazine archives at archive.org.

But, as we've been doing lately, we won't leave without going over this month's episodes of Computer Chronicles.

The first isn't especially thrilling: it was devoted to careers in computing, including entrepreneurship… One thing worth noting is that a computer science degree wasn't strictly necessary to find work in the field: a few courses taken from another degree would do. Incidentally, they were already talking about the importance of communication skills… and already remarking that women were going into computer science rather than computer engineering. Striking, though, in the news briefs at the end of the episode: AT&T presenting a "low-cost" electronic mail system at 40 cents to send a page of text! (A stamp cost 22.) With hand delivery for people without e-mail for… seven dollars and fifty cents! In 1986 money! All while IBM presented an experimental processor with 93,000 transistors (the iPhone 16's processor has… fifteen billion).

The second episode was about parallel computing. Seeing a Cray supercomputer of the era is alone worth hitting play for. The H. T. Kung interviewed halfway through the program is, by the way, one degree of separation from Deep Blue, from one of the first internet "worms," from the founding of Y Combinator (one of Silicon Valley's most important venture capital firms)… and from Google's TPUs. No small feat. And the Craig Mundie who appears right after him led Microsoft's research and strategy from 2006 to 2012.

And to close, two episodes devoted to computers and their military uses, another topic we're still talking about today. The first part goes over the long history of those uses, starting with ENIAC, and DARPA's enormous role in research in the field. Don't miss the demos of very-high-tech flight simulators that pale next to what we can run today on basically any PC. And research into autonomous weapons had already begun, featuring eventual Turing Award winner Raj Reddy.

And in the second part: computerized systems on US Navy ships capable of firing autonomously (and the pushback the idea provoked among at least some of the military of the day), Reagan's "Star Wars"… and mentions of self-driving cars.

I'd tell you to get ahead on your homework for next month… but March was the last month of the show's 85-86 season, and the next season wouldn't arrive until September.

And that's all for March… of 1986. More next month.

Readings (2026.I)

2025 ended with barely two readings posts. Let's see if 2026 gives us more to work with (though things aren't looking especially promising). For now, we start with two works read entirely in 2025 and one I finished on January 1…

You might say this book is a novelization of the story of Ramanujan, probably the most famous mathematician of the twentieth century, told through Hardy, one of the most famous mathematicians of at least the first half of that century, but it goes somewhat further than that. Wikipedia, in fact, places the author, David Leavitt, within gay literature. Although there is some of that, I think framing the novel there would be limiting it. Without being a great novel, I think it portrays mathematicians well, and as a bonus it carries a touch of period detail about England around the First World War. Leavitt, by the way, is also the author of the (this time non-novelized) biography The Man Who Knew Too Much: Alan Turing and the Invention of the Computer.

It's much appreciated, by the way, that the author had the good grace to include a chapter of sources at the end of the book, clarifying what is fiction and what is the fruit of his research and reading.

Whatever Javier Rodríguez draws ("draws" rather undersells what this man does; he's terrific), I tend to buy, even since he moved to DC (I've always been a Marvel guy, what can you do). I have the reissue of Miedo in the to-read pile and, once we're into "pijameo"1 territory, I can especially recommend his wonderful Spider-Woman (Natacha Bustos, another ace of the pencil, also worked on that series; don't miss her Moon Girl (and when I say "her" I mean she's one of the character's two creators)) and his History of the Marvel Universe. Few creators have his talent for visual composition.

Ursula K. Le Guin won the Hugo, the Nebula, and the Locus with this 1974 novel, which is an absolute marvel. The story centers on a physicist, an heir to Einstein, in a very distant future, and it's a very, very interesting political essay. Chilling in the extreme.

And the second comic of the batch. We talked about Zerocalcare last year, and he stays on brand with this work which, according to RTVE, was published in Italy in 2017, when Zero was already huge over there but barely a single one of his works had been published in Spain. Zero keeps tracing himself (sorry, I know I'll burn in hell for that one) in his pseudo-autobiographical line as a (by then) successful thirty-something struggling to square that success with his political ideas and with an environment that hasn't been as lucky in life, with touches of, shall we say, magical realism? And around here we have absolutely no problem with him keeping at it :-).

And we close this first "readings" post with Inteligencia artificial: jugar o romper la baraja, by Marga Padilla. The book, by the way, is available as a free download (they suggest a €3 donation) via the link. It's a very good explanation of what AI is, how it works, and what its potential benefits and risks are. In the world we live in, part of the audience will accuse it of techno-optimism, and the other part of techno-pessimism. Yours truly thinks it strikes a balance that's hard to find, and that it's a very good introduction for anyone who wants to discuss the topic with a minimum of informed judgment.

That's it. More in a few weeks (who knows how many).

  1. Pijameo: Spanish comics slang for superhero comics, from the notion that superhero costumes are, in fact, pajamas. ↩︎

I don't block ads: I block your intrusive way of serving me ads

It drives me up the wall every time I see that wretched message. Here, the latest version I've run into:

Allow ads to support journalism

It looks like you have a browser, extension, connection, or antivirus blocking ads on our site. Advertising is the only way to make our work possible. Disable ad blocking on [name of outlet redacted]

Dear news outlet: my browser does not block advertising. What's more, I understand that advertising is as valid a model as any for funding a news organization. I don't even want to think about how many ads I see every day on sites like yours. What my browser does block are intrusions into my privacy. Because it's not right that you have zero interest in looking for even minimally privacy-respecting alternatives for serving me ads. They exist: I repeat, I see ads on the web day in, day out. Sometimes on outlets where I also pay for a subscription (that's a topic for another day).

I'll add: yes, I occasionally give in to these tactics, open a different browser in private mode, and accept the cookies. Then the cookies get exterminated, just as they were before, and the advertising you serve me is even less effective than what I'm willing to see without your empty threats. It's called a lose-lose. Switch to a slightly more ethical ad provider. That would be supporting journalism.

What do I use to protect my privacy? The first thing is not using Chrome: not using the browser made by a company that gets something like 90% of its revenue from advertising does wonders for your privacy, who would have guessed. I use Firefox, but just by fleeing Chrome, even if you keep using a Chromium-based browser, you've taken a big step (and among the Chromiums, my favorite is Vivaldi). And, to round things off, some extension to control attempts to track me beyond what's reasonable. Mine is Ghostery's, but I'm told Privacy Badger also works very well.


Another piece on the subject: "I don't use anything whose purpose is to block ads."

Split windows in Firefox

I learn from this article at The Register that for a while now you've been able to enable split view in Firefox, which lets you see two pages in a single browser window:

Screenshot of a browser with two tabs open in split view. On the left, a WordPress editor with this very window open; on the right, this blog.

You might ask: isn't that the same as opening two browser windows? It's similar, yes, but you save a fair few pixels of browser UI. If, like me, you use vertical tabs (in my case, with Sidebery), then the thing goes from eating tons of interface to something workable:

Screenshot with two tabs open in split mode. On the left, the The Register page mentioned in this post; on the right, this blog. Also, on the left, a ton and a half of tabs shown vertically, taking up a significant chunk of pixels.

So how do you turn it on?

  • Apparently with Firefox 149 (due in three weeks, on March 24) the feature will be on by default, but meanwhile, in Firefox 148 you have to: first, go to about:config, accept the little button warning you that touching things inside about:config has its risks, then search for browser.tabs.splitView.enabled and enable it (with a double click).
  • Once it's enabled, it's as easy as Ctrl-clicking one tab and then the other, then right-clicking and choosing "Open in split view" (or whatever it's called in the language your Firefox is set to).

From there, you can change how much space each tab gets by dragging the divider between them, and split them apart again by right-clicking the tab bar and choosing "Separate split view".

And that's been today's DIY tip…

Byte, February '86

We continue with our monthly project of leafing through Byte magazine… forty years late (you'll find all the posts on the subject, quite a few by now, under this blog's Byte tag). And February '86 was devoted to… text processing (which, spoiler, is not the same thing as word processors).

Cover of the February 1986 issue of Byte. The theme is text processing. The cover illustration is a computer board with the word TEXT floating above it.

And we start by looking at advertising. The first ad, I'd say, for a little program we're still using forty years later: Excel! Wikipedia says it launched in September '85, and if you go to our post on the May '85 issue (yes, we've been at this Byte business for quite a while) you'll find the announcement that it was coming. Correct me if I'm wrong (entirely possible), but I don't think we'd seen it here since.

Double-page ad for Microsoft Excel. We see a one-button mouse and a 3.5-inch diskette.

And if the one-button mouse or the 3.5″ diskette caught your eye… yes, Microsoft originally released Excel only for the Mac.

No screenshot this time, but the letters section (page 24 and following) is also worth a stop: readers revisit the program for computing π (from the May issue!) and explain just how slowly it converges (while noting that it's very readable and a good example to learn from), plus some corrections to the program on the normal distribution (this time we only have to go back to October). Bravo for attentive readers.

Onward, this time indulging our habit of stopping at anything Amiga-related. In this case, an introduction to the Kernel, the system software contained in its ROM, written by none other than its creator, the legendary (in small circles, granted) RJ Mical. If anyone wants to read more on the subject, you can also find his manual on the Archive. #TheyDontWriteSoftwareLikeThisAnymore

Introduction to the Amiga ROM Kernel

A look inside the Amiga by the creator of Intuition

Editor's note: The first version of this article appeared on BIX (BYTE Information Exchange) on October 10, 1985.

This article introduces the building blocks of the Amiga ROM (read-only memory) Kernel software. I will examine the ROM Kernel including AmigaDOS and the disk-based libraries and devices, and present examples of translating code from other machines to the Amiga. Finally, I'll look at the hardware and special features of the ROM Kernel, describing how to use these directly in a system-integrated fashion. (Editor's note: For an overview of the Amiga from Commodore, see "The Amiga Personal Computer" by Gregg Williams, Jon Edwards, and Phillip Robinson, August 1985 BYTE, page 83.)

System Overview

It is rare for software and hardware groups to work as closely together as we did at Amiga. We exchanged and debated ideas continuously during the creation of the Amiga. The close relationship influenced the design, bringing new features to the hardware and allowing the software to take full advantage of the hardware.

The Amiga's greatest strengths lie in its modularity and the interconnections among its system components, both hardware and software. The design teams designed and developed simultaneously, and from the start they were intended to complement one another. Even though we designed the hardware pieces to fit tightly together, you can use any subset of the features without the necessity of controlling the entire machine. It's the same with the ROM software, where the pieces work closely together but each can stand alone.

The hardware and software combine efforts in many ways to achieve the Amiga's performance. For instance, the hardware includes a special coprocessor, the Copper, which synchronizes itself to the display position of the video beam without tying up the bus or the processor. The Copper can move data to one of the many hardware registers or it can cause a 68000 interrupt, which the Amiga's multitasking Exec (also known as Executive) then processes. This makes the Copper a powerful, unobtrusive auxiliary tool. It is used by the Graphics Support library for display-oriented changes and by the audio device for time-critical audio channel manipulations. You can use the Copper for time-critical operations because it's tied to the display, which is guaranteed to run at 60 Hz (the display processors start from the top of the screen 60 times a second).

The way the Amiga handles communications with its peripherals is another example of the union of hardware and software. The signals that pass between the Amiga and its peripherals are interrupt-driven. Peripherals, therefore, do not disturb the system or require monitoring until information needs to be communicated. The Amiga Exec works with the interrupt-driven communication by managing a complete interrupt-processing mechanism, providing a convenient, interleaved, prioritized processing of interrupts.

The multitasking Exec forms the core of the system software; it is a compact collection of routines that underlies the rest of the Amiga ROM software. The developers attempted to optimize the Exec for space, performance, clarity of usage, and the creation and management of lists, which are the primary components of Exec. All of the other pieces of the Exec are built on lists and, therefore, provide performance with a minimum of system overhead. You will be able to use even the more esoteric Exec functions once you learn the concept of the Exec list.

Exec is the starting point for all the other pieces of ROM software, mostly because it is the controller of tasks and interrupts. Each of the ROM Kernel software components is designed to stand alone as much as possible; programmers can choose which components to use. But at the...

And a few pages later we find an Amiga ad that is a (thoroughly deserved) homage to Denise, Paula, and Agnus, the three specialized chips for video, audio, and memory management, revolutionary for their time, which were one of the vital pieces that made the Amiga the multimedia marvel it was.

Commodore Amiga ad. Three chips are shown, and it boasts 4096 colors, four-channel stereo sound, 32 instruments, 8 sprites, 3D animation, 25 DMA channels, a bit blitter, and male and female voices.

And we leave the Amiga (until we're given the slightest excuse to come back to it 😅) and get into the issue's theme, text processing, starting with a conversation with that legend of computing, Donald Knuth (pronounced "Ka-NOOTH," by the way), today professor emeritus at Stanford, creator of TeX, and author of the magnum opus The Art of Computer Programming (in progress). By then it had already been more than a decade since he received the Turing Award, and in the interview, as befits the theme, they talk about digital typography and the creation of Metafont, software that is still used today and remains a [not so] small marvel.

COMPUTER SCIENCE CONSIDERATIONS

CONDUCTED BY G. MICHAEL VOSE AND GREGG WILLIAMS

Donald Knuth speaks on his involvement with digital typography

Text processing as a computer science problem has consumed a major portion of the time and energy of Stanford professor Donald Knuth over the past eight years. Knuth authored and placed into the public domain a highly regarded typography system that he calls TeX (pronounced "tech"), along with a font-creation language called METAFONT. In conjunction with the completion of TeX, Knuth and Addison-Wesley are publishing a five-volume work entitled Computers and Typesetting. Volume 1 is The TeXbook, volume 2 is the source code for TeX, volume 3 is The METAFONT Book, volume 4 is the METAFONT source code, and volume 5 is Computer Modern Typefaces.

To discover what so intrigued Knuth about this subject, BYTE senior editors Gregg Williams and Mike Vose conducted the following interview with Professor Knuth at Addison-Wesley's offices in Reading, Massachusetts, on November 11, 1985.

BYTE: Dr. Knuth, how did you become involved with digital typography and the public-domain system known as TeX?

Knuth: I got interested because I had written books and seen galley proofs, and suddenly computers were getting into the field of typesetting and the quality was going down.

Then I was working on a committee at Stanford planning an exam, and we got a hold of some drafts of Patrick Winston's book on artificial intelligence. We were looking at it to see if we should put it on the reading list for a comprehensive exam. It had just been brought in from Los Angeles where it had been done on a digital phototypesetter. This was the first time that I had ever seen digital type at high resolution. We had a cheap digital machine at Stanford that we thought of as a new toy. But never would I have associated it with printing a book that I'd be proud to own. Then I saw this type, and it looked as good as any I had ever seen done with metal. I knew that it was done just with zeroes and ones. I knew that it was bits. I could never, in my mind, ever, conceive of doing anything with lenses or with lead, metallurgy, and things like that. But zeroes and ones was different. I felt that I understood zeroes and ones as well as anybody! All it involved was getting the right zeroes and ones in place and I would have a machine that would do the books and solve all the quality problems. And, also, I could do it once and for all. I still had a few more volumes to write [of his seminal work, The Art of Computer Programming, a seven-volume series of which three volumes are finished] and

And, to underline what I said about text processing not meaning word processors (at least, not the ones that spring to mind first), we can take a dip into the state of the art, at the time, of natural-language interpretation:

INTERPRETATION OF NATURAL LANGUAGE

by Jordan Pollack and David L Waltz

A potential application of parallelism

This article was adapted from "Parallel Interpretation of Natural Language," presented at the International Conference on Fifth Generation Computer Systems, November 1984.

THE INTERPRETATION of natural language requires the cooperative application of both language-specific knowledge about word use, word order, and phrase structure and real-world knowledge about typical situations, events, roles, contexts, and so on. While these areas of knowledge seem distinct, it isn't easy to write a program for natural-language processing that decomposes language into its parts; i.e., you cannot construct a psychologically realistic natural-language processor by merely conjoining various knowledge-specific processing modules serially or hierarchically.

We offer instead a model based on the integration of independent syntactic, semantic, and contextual knowledge sources via spreading activation and lateral inhibition links. Figure 1 shows part of the network that is activated with the sentence

John shot some bucks. (1)

Links with arrows are activating, while those with circles are inhibiting. Mutual inhibition links between two nodes allow only one of the nodes to remain active for any duration. (However, both nodes may be simultaneously inactive.) Mutual inhibition links are generally placed between nodes that represent mutually incompatible interpretations, while mutual activation links join compatible ones. If the context in which this sentence occurs has included a reference to "gambling," only the shaded nodes of figure 1a remain active after relaxation of the network. But if "hunting" has been primed, only the shaded nodes shown in figure 1b will remain active. Notice that the "decision" made by the system integrates syntactic, semantic, and contextual knowledge: The fact that "some bucks" is a legal noun phrase is a factor in killing the readings of "bucks" as a verb; the fact that "hunting" is associated with both the "fire" meaning of "shot" and the "deer" meaning of "bucks" leads to the activation of the coalition of nodes shown in figure 1b; and so on. At the same time, the knowledge base in our model is easy to add to or modify. In this model of processing, decisions are spread out over time, allowing various knowledge sources to be brought to bear on the elements of the interpretation process. This is a radical departure from cognitive models based on the convenient decision procedures provided by conventional programming languages.

Our program operates by dynamically constructing a graph with weighted nodes and links from a sentence while running an iterative operation that recomputes each node's activation level (or weight) based on a function of its current value and the inner product of its links---
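The relaxation loop the authors describe, where each node's new activation mixes its current value with a weighted sum over its links, can be caricatured in a few lines. This is a toy with invented nodes and weights, not the paper's actual network: "hunting" is clamped as the primed context, the two readings of "bucks" inhibit each other, and after a few iterations the "deer" reading wins.

```python
# Toy spreading-activation / lateral-inhibition network (made-up
# nodes and weights, loosely in the spirit of the article).
links = {
    "bucks=deer":    [("hunting", 0.8), ("bucks=dollars", -1.0)],
    "bucks=dollars": [("gambling", 0.8), ("bucks=deer", -1.0)],
    "hunting":       [],   # context nodes have no incoming links;
    "gambling":      [],   # their activation stays clamped below
}
act = {"bucks=deer": 0.5, "bucks=dollars": 0.5,
       "hunting": 1.0, "gambling": 0.0}   # "hunting" has been primed

for _ in range(20):
    new = {}
    for node, edges in links.items():
        if not edges:
            new[node] = act[node]          # leave context nodes clamped
            continue
        incoming = sum(w * act[src] for src, w in edges)
        # Mix old value and incoming signal, clamped to [0, 1].
        new[node] = min(1.0, max(0.0, 0.5 * act[node] + 0.5 * incoming))
    act = new

print(act["bucks=deer"] > act["bucks=dollars"])  # True: "deer" survives
```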

(As is the house custom, both Pollack and Waltz are not just experts but pioneers in the field.)

We continue with the theme. We complain (rightly) that the arts and humanities are kept excessively walled off from technology in many people's heads, and that this is the source of quite a few of our problems. In the eighties it was already largely like that, let's not kid ourselves, but every now and then you could see things like an article in a technology magazine devoted to the processing of… poetry.

POETRY PROCESSING

by Michael Newman

The concept of artistic freedom takes on new meaning when text processing handles the mundane tasks of prosody

For over a year, Michael Newman, Hillel Chiel (a researcher at Columbia Medical School), and Paul Holzer (a programmer and analyst for PaineWebber) have been developing The Poetry Processor: Orpheus A-B-G. The software is not yet commercially available, but we are pleased to share Michael Newman's thoughts on poetry processing and a module of Paul Holzer's code that shows off some of the new application's capabilities.

THE PROPERTIES OF a medium can have a decisive impact on the nature of what the medium conveys. Poetry began in an oral bardic tradition. It was newsy, folksy, evocative of the doings of great heroes. It had to be accessible to folk encountered at a roadside as well as pleasurable to more educated people met at court. There was no great emphasis on intricate forms, on how the poem looked on a page, because the page was not where the poem resided. The poem was voice-resident, ear-active. When Gutenberg invented movable type he did more than spring the Bible. His invention ultimately provided a watershed, an opportunity for the consolidation of language itself — and Shakespeare jumped on the opportunity. He reconfigured poetry, bringing together history, tragedy, and comedy under its roof. And, by casting poetry as theatre, he popularized it immensely.

Poetry in print became more permanent, less permutable; more visual, less aural. In this century, with the development of free verse, the poem has become almost a visual object, broken up and spread all over the page. There is even concrete poetry, which makes a fetish of typography.

Another world that makes a fetish of typography is software, specifically the largest part of software: word processing. Software is about as permanent as print because you can always get a printout, but it is much more permutable. And, above all, it is interactive.

So what will be the impact of this revolutionary new medium on the oldest, most interactive, programmatic, musical, and image-provoking form of human speech? And what will be the impact of poetry on software?

Classical poetic forms, such as the sonnet, the villanelle, and the sestina, are natural-language programs, algorithms. The sonnet is a set of instructions specifying 14 lines of iambic pentameter; a line of iambic pentameter contains five iambic units (feet). An iamb is a two-syllable unit with the accent on the second syllable.

Poetic algorithms have more in common with programming than their algorithmicness and use of powerful syntax. Poems involve iteration: Not only do iambs repeat and five-beat lines repeat, but ending-sounds repeat (rhyme in a sonnet), whole lines repeat (refrains and rhymes in a villanelle), words repeat (ending words in a sestina). Individual letters repeat in alliteration. This repetition is something poets count, and something poetry readers see and hear. If poets can count these things, so can a computer. If readers see and hear these things, so can the computer user— in an enhanced way.

Poems also involve two other cornerstones of computer science: recursion and conditionality. Every sonnet written refers to others of its kind. It...

Please don't miss the discussion of how to extract a poem's meter automatically (in English, no less, where the whole thing depends more on stressed and unstressed syllables than it does in Spanish):

Machine Reading of Metric Verse
by Paul Holzer

A computer can definitively scan a line of poetry for its stress pattern principally in one of two ways: (1) an algorithm can deduce the syllabic structure and the stressed syllables from analysis of the letters that make up the word, or (2) the computer can look up every word in a dictionary database that holds the syllabification and accentuation of every word. The lookup method requires a large database, and the algorithmic approach is complex and requires a deep analysis of English phonetics and spelling.

One of the features of a poetry processor is that the poet-user can specify the meter of every line of a poem (see photo A). For example, the string .-/.-/.-/.-/.-/ represents iambic pentameter. Dots (.) indicate an unstressed syllable and dashes (-) represent a stressed one. The slash (/) indicates the end of a foot, the basic metric unit. The first line of Shakespeare's Sonnet 18

shall I comPARE thee TO a SUMmer's DAY?

is an example of a line of iambic pentameter. The stressed syllables are in uppercase.

After writing a poem, users might request a metric scan of the poem. I will describe here a method for doing this that is not based on one of the two general solutions I mentioned in the first paragraph. Instead, the processor will break each word into its syllables and then redisplay each line, with each syllable in uppercase or lowercase according to the position of the dots and dashes in a user-specified metric form. So, were Shakespeare trying to compose trochaic pentameter, with the metric pattern -./-./-./-./-./, the processor would reply with

SHALL i COMpare THEE to A sumMER'S day?

He would read this to himself, trying to put the stress on the uppercase syllables. Noting the rhythmic clumsiness, he might rewrite his line as follows:

To a summer's day I shall compare thee

and the processor would respond:

TO a SUMmer's DAY i SHALL comPARE thee.

Sounds better!
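The redisplay step just described is easy to sketch. This is a hypothetical Python rendition (the article's actual program is in Pascal); the function name and the pre-split syllable list are my own assumptions:

```python
# Hypothetical sketch of the redisplay step: given a line already split into
# syllables and a metric pattern ('.' = unstressed, '-' = stressed, '/' is a
# foot separator and is ignored), show each syllable in upper- or lowercase.

def redisplay(syllables, pattern):
    stresses = [c == "-" for c in pattern if c in ".-"]
    shown = [s.upper() if stressed else s.lower()
             for s, stressed in zip(syllables, stresses)]
    return " ".join(shown)  # word boundaries are lost in this toy version

# Shakespeare's line forced into trochaic pentameter (-./ repeated):
line = ["shall", "i", "com", "pare", "thee", "to", "a", "sum", "mer's", "day"]
print(redisplay(line, "-./-./-./-./-./"))
# SHALL i COM pare THEE to A sum MER'S day
```

The hard part, of course, is producing that syllable list in the first place, which is what the rest of the article is about.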

The main task for the computer is to break each word into its syllables. The algorithm is based on a systematic application of what appear to be the general rules by which English words break into syllables. Of course, there are no fixed rules, as evidenced by the fact that different dictionaries give different syllabifications for the same word.

The following is a simple version of the algorithm:

1. Break the word up into a sequence of alternating vowel and consonant groupings. Thus microcomputer becomes m i cr o c o mp u t e r. Wherever there is a vowel or group of contiguous vowels, there will be a syllable. We need only assign the neighboring consonants to the syllable on the right or to the syllable on the left.

2. If the first vowel group has a consonant group to its left, then assimilate this consonant group to the vowel group. This leads, in our example, to mi cr o c o mp u t e r.

3. If the final vowel group has a consonant group to its right, then assimilate this consonant group to the vowel group. We now get mi cr o c o mp u t er.

4. For the remaining unassigned consonants, do the following:

a. If the consonant stands alone, attach it to the following vowel. Thus we get mi cr o co mp u ter.

b. If there are two consonants, split them. We get mic ro com pu ter.

c. If there are three consonants, then:

i. If there is a doubled consonant, split the pair; thus apply becomes a ppl y and finally ap ply.

ii. If there is no doubled consonant, but the first of the three consonants is n, r, or l, then split between the second and third consonants.

iii. In all other cases, split between the first and second consonants.
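The split rules above can be sketched in a few lines of Python. This is my loose transcription, not the article's Pascal; the preprocessing steps the author describes next are omitted here, and y is simply treated as a vowel, both of which are simplifications:

```python
import re

VOWELS = "aeiouy"  # treating y as a vowel is itself a simplification

def syllabify(word):
    w = word.lower()
    # Rule 1: break into alternating vowel and consonant groupings.
    groups = re.findall(f"[{VOWELS}]+|[^{VOWELS}]+", w)
    vowel_idx = [i for i, g in enumerate(groups) if g[0] in VOWELS]
    if not vowel_idx:
        return [w]
    syls = [groups[i] for i in vowel_idx]   # one syllable per vowel group
    if vowel_idx[0] > 0:                    # rule 2: leading consonants
        syls[0] = groups[0] + syls[0]
    if vowel_idx[-1] < len(groups) - 1:     # rule 3: trailing consonants
        syls[-1] += groups[-1]
    for k in range(len(vowel_idx) - 1):     # rule 4: interior consonant runs
        c = groups[vowel_idx[k] + 1]
        if len(c) == 1:                     # 4a: attach to following vowel
            cut = 0
        elif len(c) == 2:                   # 4b: split the pair
            cut = 1
        elif c[0] == c[1]:                  # 4c-i: doubled consonant, split it
            cut = 1
        elif c[1] == c[2]:                  # 4c-i: doubled consonant, split it
            cut = 2
        elif c[0] in "nrl":                 # 4c-ii: n, r, l first: split 2|3
            cut = 2
        else:                               # 4c-iii: split 1|2
            cut = 1
        syls[k] += c[:cut]
        syls[k + 1] = c[cut:] + syls[k + 1]
    return syls

print(" ".join(syllabify("microcomputer")))  # mic ro com pu ter
print(" ".join(syllabify("apply")))          # ap ply
```

Running it on the article's two worked examples reproduces the splits given in the text.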

Before applying this algorithm, however, we must preprocess the initial string of letters in order to take into account certain peculiarities of English orthography:

1. Final e is silent (with certain exceptions); treat it as a special consonant. Thus compute is syllabified com pute rather than com pu te.

2. Translate many two-letter sequences into special single consonants, e.g., sh, th, gu, qu, and ck.

3. Identify common suffixes. For example, the algorithm applied to blameless would yield bla me less. However, when less is removed as a suffix, the e in blame would be recognized as silent, yielding blame less.

4. Identify some prefixes. For example, if en is recognized as a prefix, then enact becomes en act, rather than e nact.

(From a sidebar interleaved into the article at this point, on poetry and programming:)

"...thinking of the program as something for me to use—the relational table of contents was so the user could access my work. The program was originally to have been just a floppy solution to my table-of-contents dilemma. But you don't get that involved in a software application without elaborating and generalizing. In that way software is very much like poetic forms. You use it for the sake of using it. It generates its own kind of trance. Poetry and programming, once you look at them in context, were just made for each other.

"Marriages like this one, made in heaven, often are so because they are marriages of convenience. One of the impediments to formal verse writing is the inconvenience of having to make repeated book accesses for rhymes, just when the form has prompted some involvement. You stop and look and lose something. That's one reason people have tried to do without forms. But that's throwing out the baby with the bathwater. You don't stop measuring and sounding things out..."
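The preprocessing rules listed above can also be sketched quickly. This is an illustrative Python fragment, not the article's code; the suffix list and the placeholder symbols for digraphs and the silent e are my own assumptions (prefix handling, rule 4, is omitted):

```python
# Rough sketch of preprocessing rules 1-3. The '*' stands in for the
# "special consonant" that marks a silent final e; the digraph placeholders
# and suffix list are illustrative only.
SUFFIXES = ("less", "ness", "ment", "ful")
DIGRAPHS = {"sh": "$", "th": "#", "ck": "%", "qu": "q"}

def preprocess(word):
    w = word.lower()
    suffix = ""
    for s in SUFFIXES:                    # rule 3: peel off a known suffix
        if w.endswith(s) and len(w) > len(s) + 2:
            w, suffix = w[: -len(s)], s
            break
    if w.endswith("e"):                   # rule 1: final e as special consonant
        w = w[:-1] + "*"
    for two, one in DIGRAPHS.items():     # rule 2: digraphs as one consonant
        w = w.replace(two, one)
    return w, suffix

print(preprocess("blameless"))  # ('blam*', 'less')
```

With the suffix stripped first, the e in blame is correctly flagged as silent, exactly the behavior the article describes.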

It seems to be impossible to come up with a reasonably small set of rules and preprocessing steps to guarantee correct syllabification of all words. Two examples will illustrate some of the inherent difficulties:

1. Compound words: The algorithm will not detect the silent e in snake within the compound word snakebite unless the fragment bite is recognized as a word or treated as a suffix. Avoiding the problem would require either extensive word or prefix table lookups.

2. Successive vowels in different syllables: In reach, the ea is a single vowel sound, and the algorithm would treat it correctly. In react, we pronounce the e and a separately, and the correct syllabification is re act. Were the algorithm modified to isolate re as a prefix, it would treat react correctly, but turn reach into re ach.

Where ambiguities can arise, the best approach is to formulate a rule that leads to the smallest number of cases requiring table lookups for resolution. The present algorithm is not perfect, but it produces a readable, if not dictionary-perfect, syllabified word 95 percent of the time.

I have provided a Pascal program that implements the syllabification algorithm and illustrates how The Poetry Processor "reads" a user's poem according to a user-specified metric scheme. [Editor's note: The Microsoft Pascal source code and executable version are available from BYTEnet Listings, telephone (617) 861-9764, as SCANPOEM.PAS and SCANPOEM.EXE. The executable version requires any MS-DOS or PC-DOS machine.] To run the program, prepare two files. TEST.POE must contain the lines of poetry. You can write TEST.POE as a text file with each line of the poem on a separate line. A second text file, TEST.FRM, should have a line containing a string of dots (.) and dashes (-) indicating the accentual scheme that each line of poetry is supposed to follow. Slashes indicating the end of a foot are optional.

As an example, a Shakespearean sonnet (iambic pentameter) will have a TEST.FRM file consisting of 14 lines of .-/.-/.-/.-/.-/. Each line in TEST.FRM must end with an asterisk. After editing the TEST.FRM and TEST.POE files, you can run the program by entering its name, SCANPOEM. The computer will "read" the poem, printing in uppercase the appropriately stressed syllables.

Note that the program is a prototype version of the algorithm. It will not handle text with capital letters, apostrophes, or punctuation, so be careful not to include these features in TEST.POE. When using this demonstration program, you will undoubtedly find that some words are not properly syllabified.

But the peak of geekery, honestly, is an entire article devoted to the deeply learned (I'm only half joking here) question of whether it's worth learning to type on a Dvorak keyboard (#TLDR: the authors think it is, provided you can afford the luxury of always typing on a Dvorak keyboard). That the lead author is a professor emeritus... of physics, devoted to forensic astronomy, is just the icing on the cake.

Did I say we'd come back to the Amiga at the first opportunity? I did, didn't I? Here, the British origins of AmigaDOS:

Tripos—The Roots of AmigaDOS

Metacomco is the British company behind AmigaDOS

by Dick Pountain

A question that must be puzzling many people in U.S. computer circles is, "What is Metacomco?" When Commodore announced its spectacular Amiga computer, much of the U.S. press failed to point out (and possibly did not know) that the advanced operating system AmigaDOS was in fact written by a small British software house called Metacomco. (For more information on the Amiga, see "The Amiga Personal Computer" by Gregg Williams, Jon Edwards, and Phillip Robinson, August 1985 BYTE, page 83.)

Metacomco is based in Bristol, England, a city that is beginning to rival Cambridge as our potential computing capital (it also houses TDI-Pinnacle, INMOS, and others). Metacomco was founded in 1981 by Derek Budge and Bill Meakin and now employs about 25 people, mainly programmers and other technical staff.

The company's first product was a portable BASIC interpreter written in BCPL, the forerunner of C, which is taught and used extensively at Cambridge University. This interpreter was ported to the 8086 processor and shortly afterward was sold to Digital Research Inc., which still markets its descendant as Personal BASIC. This U.S. link became very important to Metacomco, for the royalties provided a steady source of income during the crucial early years and helped the company establish an office in California, which kept Metacomco in touch with the U.S. computer scene.

In 1983 Dr. Tim King, a Cambridge computer scientist, was engaged by the company as a consultant, and Metacomco's emphasis switched to the 68000 processor, with which King had been working since the first samples came out in 1981. The company produced a series of development tools, also written in BCPL, including a fullscreen editor, a macro assembler, and a linking loader. At that time there was no clearly established standard operating system for the 68000, so the next step was to write one. Subsequently, Tripos was born.

The Tripos operating system was based on a multitasking kernel developed as a doctoral thesis project at Cambridge in 1976. ("Tripos" was the name given to the three-legged stools that students sat on in the old days when taking their examinations and has since become the colloquial name for the Cambridge final examinations.) King, then working at Bath University, took the kernel written for a DEC PDP-11 and made it into a full 32-bit multitasking operating system for the Sage microcomputer (which was new at that time). Tripos is BCPL-based in the same way that UNIX is C-based, and it has many innovative features that I will discuss.

Metacomco had also purchased the rights to Cambridge LISP, a powerful LISP interpreter/compiler originally developed for the IBM 370 and then ported to the 68000 at Cambridge. Metacomco produced versions for the ill-fated CP/M 68K and then for Tripos. Reduce 3, a symbolic math system written in LISP, was added to produce a Sage-based workstation that was sold to research institutions in various countries. Customers included SORD in Japan and Bristol neighbor INMOS, who used BCPL for the first stage of bootstrapping its Occam compiler onto the 68000, using Sage computers running Tripos.

In 1984, Tim King joined Metacomco full-time as Research Director, and Sinclair Research launched the QL. Initially the QL lacked a serious software-development environment, and Metacomco was able to quickly port its development tools, including the BCPL compiler, to it. The company has since extended the range to include an ISO (International Organization for Standardization)-validated Pascal compiler, and it markets these products directly, rather than via the manufacturer, largely by mail order.

November 1984 is the crucial date in the AmigaDOS story. Metacomco visited Amiga...

And yet one more page of Amiga content, though here it's not the content I want to highlight but the container. It's 1986, and the world is starting to connect digitally. Byte, in fact, has its own online service, BIX (the Byte Information Exchange), which had launched the previous June (at six dollars an hour, in 1986 money)... but the audience was so small (Wikipedia says they reached 17,000 users in '87) that the magazine hyped the service by running a "Best of BIX" section in its pages. Maybe we have changed a bit after all, in these forty years...

Best of BIX

AMIGA

Commodore's introduction of the Amiga has produced a flurry of activity among professional developers and personal computer users within the Amiga conference. The summary this month includes discussion on cables, monitors, printers, and software fixes. One of the hottest topics in the Amiga conference is on the subject of improving the performance of the Amiga by removing the 68000 and replacing it with a 68010 or 68020.

68010/68020 Upgrade

amiga/amiga68000 #22

An Amiga conference member asked if he could just drop a 68010 into the 68000 socket. This would give a 10 to 80 percent boost in performance! He had one, just sitting up to its bottom in black foam, on the shelf. But there were all these warnings about what would happen to his warranty if he opened the case.

amiga/amiga68000 #26, from rickross [Richard Ross, Eidetic Imaging]

M68010 works! A 68010 plugs directly into the Amiga and no problems were detected in the operation of the system software. Also, for everyone like me who has been trying to judge from the BYTE review photos, the microprocessor is socketed. The performance increase gained by the switch is not phenomenal, and no benchmarks are available, but it did run perceptibly faster. The M68020 has also been tried and seems to work as well.

amiga/amiga68000 #32

A BIX user provides the following:

The company that markets the 68020 piggyback board is Computer System Associates Inc., 7564 Trade St., San Diego, CA 92121, (619) 566-3911. The prices are:

Board only: $575
Board plus 68020: $975
Board plus 68020 and 68881: $1480

For more information, contact Patricia Chouinard at the address above. I believe that 68000/68010 supervisor code that handles exceptions and certain other privileged functions will have to be modified. User code should work as is.

amiga/tech.talk #39

An Amiga owner describes his adventure in opening his computer and replacing the CPU:

You just got your Amiga and it's already the slow boy on the block, right? You can plug a 68010 into an Amiga (there goes my warranty) and it does go faster. My Sieve benchmark is down to 5.8 seconds from 6.1.

Note: Your warranty will most likely be dead after you do this. Also, there is a lot of RFI shielding inside the Amiga. You get to undo a lot of screws, bend a couple of tabs, and pray a lot. If you aren't a tech type, don't even think about doing this yourself. The 68000 is socketed, but it is partially under the micro-disk drive, so you have to lift it from one end and kind of levitate out the other end (use of your CHI helps). Also, you only take out the screws in the deep wells on the bottom (five in all). Then there are four places where the top grabs the base at the four corners (there were already marks on mine from where it was put together, I guess). Once you have the top off there is a big surprise waiting for you... Another big surprise is that big RFI shield. Yes, it is a $#%+& to get off! There are screws on three sides and two tabs of metal to untwist. Once the shielding is out of the way, your first sight is of the WCS [writable control store] daughterboard. The custom chips and two parallel I/O chips are made with MOS technology.

The CPU is made by Motorola. The main board looks pretty much like the BYTE review photos. The boot ROMs are 27256s! This gives a 32K-byte by 16-bit boot ROM! What are you guys hiding in there? I could put a BASIC interpreter in that much space!

If you attempt to change your CPU, don't blame me if you muff it! If you don't know about how to make yourself static-free, you could really buy yourself some trouble of the worst kind.

Compatibility: I've run all of the Workbench demos. Everything seems fine, but I'm not making any promises. . .
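For context, the "Sieve benchmark" the poster is timing (6.1 seconds down to 5.8) is BYTE's own standard benchmark, Gilbreath's Sieve of Eratosthenes over 8190 flags. A straight Python transcription of the classic routine, for the curious (the original was typically published in BASIC, Pascal, or C):

```python
# BYTE's classic Sieve benchmark: flag i stands for the odd number 2i+3,
# so the sieve finds all odd primes up to 2*8190+3 = 16383.

def sieve(size=8190):
    flags = [True] * (size + 1)
    count = 0
    for i in range(size + 1):
        if flags[i]:
            prime = i + i + 3                       # the odd number this flag represents
            for k in range(i + prime, size + 1, prime):
                flags[k] = False                    # knock out its odd multiples
            count += 1
    return count

print(sieve())  # 1899, the count the benchmark traditionally reports
```

The benchmark was usually run ten times in a loop and the total wall time reported, which is what the 5.8-versus-6.1-second figures refer to.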

amiga/tech.talk #41

The adventurous Amiga owner says that yes, his Amiga boots up, squeaks and everything! All the software he has runs and works great. The only potential problem at this point is how many times the MOVE SR,dest op code is used. This is the only active op-code difference. There is a whole host of new goodies, though, some that make a desire for an MC68881 easier to satisfy.

amiga/tech.talk #43: a comment to 39

Another BIX subscriber replied that the upgrade produced only a 5 percent increase in throughput. Perhaps fortunate, because the descriptions of the hardware here have indicated that bus bandwidth consumption by the 68000 is low enough to allow other custom DMA chips to steal enough cycles to get their work done. It would appear that inserting a 68020 in the socket would require faster bimmers, etc.

amiga/tech.talk #44: a comment to 43

Wouldn't think just putting in a 68020 would affect DMA. Same clock speed. Or does the '20 do something different cycle-wise?

amiga/tech.talk #45: a comment to 44

The author of message 43 replied that the 68020 at the same clock speed will finish an instruction or series of instructions internal to the CPU in less time and start requesting the bus for some ROM or RAM access. He assumed that the DMA chips hold a higher bus priority, so the result will be that the 68020 will often be sitting there in idle awaiting the BUSACK signal. Waste of a 68020. Perhaps that explains why there is only a 5 percent 68010 edge over the 68000.

amiga/tech.talk #46: a comment to 45

Somebody said that the 68000 only uses every other clock cycle (for memory access, that is). The DMA hardware is fast enough to do four accesses during every clock cycle. Most of the DMA accesses the bus during periods when the 68000 doesn't. If the 68020 doesn't have these quiet periods then there could be problems.

amiga/tech.talk #47: a comment to 46

Actually, there is a counterargument to that, which is that the 68020, but not the 68010, has an instruction-only cache, which would mean...

Before closing this section, I want to take the opportunity to note Ars Technica's obituary of Robert Tinney. Who is Robert Tinney? The illustrator of many of the covers of the Byte issues we've featured here, who passed away this February 1st. That his obituary ran in Ars says a lot both about the magazine's relevance and about the visual impact Tinney's work had on a great many people. Curiously, we're very close to reaching the issues in which the magazine stopped commissioning Tinney and switched to photo covers, as you can check in the Byte magazine archives at archive.org, which you can also use, if you like, to skip ahead and see what "next month's" issue is about. I'll add that Tinney had a store, still active (and I hope it stays that way for a long time), and that right now I'm fighting very hard with myself not to buy posters of the 1982 digital-arts issue, the April '85 cover, or the "keys to education" one from, no less, July 1980.


And we also continue our run through the February '86 episodes of Computer Chronicles.

The first episode is devoted to trading stocks by computer, a novelty at the time. I didn't find it especially interesting, beyond the gadgets for receiving financial data over FM radio, both as a standalone device and as an add-on for your PC.

The second program of the month is about "psychological software", ranging from software to assist with certain therapies (with the era's sophistication, closer to the little game you play when renewing your driver's license) to tests of various kinds, with their, inevitably, "artificial intelligence modules"... and the same worries and the same dodges that sound so familiar today.

(And in the news briefs, word of Commodore's crisis: the company owed the banks two hundred million dollars. It wouldn't actually die until '94, but things were already starting to smell of smoke.)

The third program of the month was devoted to astronomy software, both professional and amateur (the latter quite recognizable to anyone who has ever used an astronomy app... only four orders of magnitude less powerful and with Jurassic interfaces). The discussion of "professional" astronomy... the usual: people marveling at how far technology in the field had come... which now looks almost like a toy to us.

(And in the briefs: the death of the legendary Osborne... fifty-three million dollars in losses at Commodore, as if the two hundred million in debt weren't enough... and Steve Jobs's purchase of Pixar for "several million dollars".)

Episode 3×22, devoted to color, sadly appears to be lost. As usual, you can snoop on what's coming in March both in the Wikipedia episode list and in the playlist that the YouTube videos above belong to.

And with that we close out the month. More in a few weeks.