Sunday, 20 March 2016

1040 STE Revival! (with added joysticks)

There's no doubt about it, reviving my Mega ST has revived my Atari obsession too. I'm now up until the wee small hours every night browsing the Atari Forum and researching obscure bits of hardware and software!

The next step was, as I mentioned last time out, to rescue my 'original' STE - the one I grew up with - from storage.

STE Cleaning and RAM upgrade


First things first - it was *filthy*! At least fifteen years of sitting near a window (before being stored away) also means that it's now very sunburned. Next to my Mega ST, which has obviously been kept away from sunlight in a studio for its whole life, the STE looks really yellowed.

Opening up the case revealed decades' worth of dust and dirt, so the first task was to give the upper part of the case a good wash and clean:




I'm not sure I'll go the whole hog and try to remove the yellowing with hydrogen peroxide or 'Retr0Brite'. Although I've seen some positive reports, part of me quite likes the sunburned look - an indicator of a life well lived :)

A can of compressed air helped clean out the keyboard and the dust that had gotten past all the internal shielding to the motherboard. Much, much better!

Next, I decided to take advantage of the fact that the STE has easily removable RAM on SIMMs, unlike some other models which had the memory soldered directly to the motherboard. Replacement RAM is very cheaply available on eBay and elsewhere, so I decided to order the 4Mb upgrade. Now the STE will be able to handle the hard-drive-enabled games, networking tools and other software that require 2Mb+ - ready for anything, in fact!


Post-RAM-upgrade STE sysinfo


Essential software

NVDI

One of the best pieces of software I've come across, for both the STE and the Mega, is also one that benefits from having more than a Megabyte of RAM. It's NVDI, a great graphics accelerator. Quite simply, it makes the whole system zip along like never before. Opening files and folders on the desktop really is noticeably faster, making it a real joy to use.

The latest version is 6, I think, but the v.5 download can be found on a number of websites and on the Atari Forum. It can be a little tricky to install from hard drive (it originally came on three floppy disks) and it does require a hard drive with a few megabytes of free space, but for GEM applications it's well worth it.

Super Boot III

https://sites.google.com/site/stessential/boot-managers/super-boot

Super Boot is an excellent boot manager for the ST. I found that I needed different boot configurations for different scenarios. Sometimes I would need NVDI, for example, but sometimes it took up too much memory. Sometimes I would need the STinG networking stack and sometimes not. With Super Boot it's easy to pick and choose which components to load from the hard drive at boot time.

It's tricky to set up and get the hang of - there are lots of options, not all of which are clear. But there's a comprehensive readme file with the installer, which explains everything and tells you what files to put where.

I wouldn't be without it now and highly recommend it for hard drive owners.


Floppy imaging


With both STs set up and ready to go, my next task was to begin archiving the 250-odd floppy disks I had from back in the 90s. The aim was to preserve the data and to make it available for use in emulators. This is still very much an ongoing project, but I thought I'd mention a couple of tools which have proved essential:

JayMSA


JayMSA on my Mega


JayMSA works really well for converting standard ST disks with no errors to .ST and .MSA image formats. It's quick and easy to use - especially when used on a system with NVDI. (For some reason, without the graphics accelerator some of the fonts can be partially obscured.)

It tends to baulk at disks with non-standard geometry (extra tracks and/or sectors, usually), copy-protected games and disks with sector errors. But it's coped really well with most of the disks I've thrown at it.

I can usually image around 25 floppies before my data partition is full. That's when I hook up my PARCP-USB adapter and transfer them over to my Mac, and to Dropbox.
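As a back-of-envelope check (assuming standard double-sided disks, whose raw .ST images weigh in at about 720 KB each), a batch of 25 works out at roughly 18 MB, which tallies with the size of my data partition:

```shell
# 25 raw .ST images at ~720 KB each: about 18 MB before the
# data partition fills up.
echo "$((25 * 720)) KB"   # prints: 18000 KB
```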


Pasti

http://pasti.fxatari.com/


The Pasti format (.STX) contains more metadata about the disk's geometry and layout. Unfortunately it's closed source and the imaging tool doesn't appear to be maintained any more, so it's not ideal. But it is pretty much the only solution for imaging the slightly 'odd' disks, copy protected games and disks with errors. So far it's only baulked at one of my disks - not too bad.

Joystick repairs


Another work in progress - something to keep me busy while the ST is chuntering away imaging disks.

I've dug out three old joysticks, only one of which was still working (strangely, the one with my sister's name on the box - yes, I was a notorious joystick-wrecker back in the day!)

They all needed a thoroughly good clean!

Opening up one of the non-working ones, I was able to clean the contacts on the internal micro switches and bring it back to life relatively simply.




This YouTube video was a great help, as was this wiki page, but luckily I didn't need to replace any micro switches (although a couple of the fire buttons are perhaps a bit worn, so could probably benefit from a new set). So, I now have two working Zip Stiks - anyone for a quick blast of Sensible Soccer?!

Sadly the third, a QuickJoy Turbo, needs a bit of soldering, so that will have to wait.

It's great to get these cleaned and working again - especially given the eye-watering prices these are advertised for on eBay at the moment! (Ahm no' selling!)


Next up... I have some new gadgets...

Sunday, 13 March 2016

Mega ST : Revival pt.3

Getting my Mega ST up and running and onto my home network was an achievement in itself, but since then I've found a few interesting tools to use on the network.

UIPTool


I've been really enjoying using my PARCP-USB adapter for transferring files to and from the ST. It's quick, easy and reliable. But I came across a tool which could be really useful in situations where connecting up to another computer with the USB cable isn't ideal.

It's called UIPTool, and it's an open source project, available from https://bitbucket.org/sqward/uip-tools

One of the great things about it is that it doesn't require the STinG networking stack (in my previous post I documented the tricky process of setting this up). You just run the program and it automatically detects the network card and requests an IP address from the router via DHCP. Very easy!

Once up and running, you can access the ST via a web browser from any computer on the same LAN. Directories can be browsed and files can be copied to and from the ST with ease.




James Mackenzie has put together an excellent YouTube video which shows the whole process. He has a fancy wee NetUSBee adapter, but it's exactly the same process with my hulking great EtherNEC box :-)



The UIPTool author appears to be continuing to add features, such as a built-in FTP server and static IP support. There's certainly lots of potential for this, so it's one to keep an eye on.

Telstar


Telstar is a Telnet client for the ST. It's available (TELSTAR.LZH) from a number of FTP sites, including this one: http://storage.atari-source.org/atari/mirrors/kurobox.serveftp.net/internet/

Using this, I was able to access the excellent Atari-hosted DarkForce Bulletin Board System (BBS) and relive lots of 90s memories!




Next up


Getting the Mega ST up and running, and interacting with other machines on my home network and out on the internet, has inspired me to dig out my original, much-loved 1040 STE from my Mum's house, along with the hundreds of floppy disks stored away for safe-keeping, and rediscover some of the software and games from my youth.

I've also managed to get myself a CosmosEx, during one of the (very short) windows of opportunity. Once it arrives it should be a lot of fun...

Sunday, 14 February 2016

Mega ST : Revival pt.2

With a pristine, formatted 50Mb hard drive (see previous post), my first inclination was to fill it with games from the vast quantity that have been adapted to run from hard drives (and from newer gadgets such as the UltraSatan or CosmosEx, neither of which I can quite afford at the moment, sadly).

But this urge had to be resisted (for now!)

Networking my Mega ST


First priority was to get my EtherNEC box working. The EtherNEC has been somewhat surpassed now by the NetUSBee, a neat little device that provides a USB port (so you can plug in a USB mouse) as well as an ethernet connection. But you can only work with what you have :) The NetUSBee's networking part seems to use the same drivers as the EtherNEC, so it should be the same process to set up if you have one of the newer devices.

The EtherNEC is a big black box containing an old ISA RTL8019AS ethernet card. It has a standard ethernet port on one end, and a ribbon cable extending from the side. On the end of this is an adapter that plugs into the Atari cartridge port. Thankfully it's clearly labelled so that you know which way round to plug it in.

EtherNEC - photo from http://www.atarikit.co.uk/ethernec/ethernec.html

As I was using the PARCP-USB adapter, my strategy was to download all the files necessary, get them unzipped and into the right places and edit the required config files on my Mac, before transfer to the ST.

I should probably also mention that my Mega has 4Mb of RAM - if you have less you may struggle, unfortunately.

STinG

First of all, I grabbed sfl.zip from Step Two here:
http://hardware.atari.org/sfl/
This is STinG, the ST networking stack. It's also available from the ST Essentials site, but the zip file above gave me just the files I needed (the full STinG package contains lots of other files which are unnecessary in this situation).

I extracted the following:

/XCONTROL.ACC
/CONTROL.INF

/AUTO/STING.PRG
/AUTO/STING.INF

/CPX/SERIAL.CPZ
/CPX/SERIAL.TXT
/CPX/STING.CPX
/CPX/STNGPORT.CPX
/CPX/STNGPROT.CPZ

/STING/CEN_PLEP.STY
/STING/CEN_PLIP.STY
/STING/DEFAULT.CFG
/STING/DIAL.INF
/STING/DIAL.RSC
/STING/DIAL.SCR
/STING/DIALER.APP
/STING/ICON.RSC
/STING/LCLTLK.STY
/STING/LOCAL.FEE
/STING/LOGIN.BAT
/STING/MASQUE.STY
/STING/MIDI.STY
/STING/RESOLVE.STX
/STING/ROUTE.TAB
/STING/SERIAL.STX
/STING/TCP.STX
/STING/UDP.STX

/STING/TOOLS/MASQLOOK.APP
/STING/TOOLS/M_MASTER.APP
/STING/TOOLS/PING.PRG
/STING/TOOLS/PING.RSC
/STING/TOOLS/SAVE_IP.GTP
/STING/TOOLS/SHUTDOWN.PRG
/STING/TOOLS/SYSINFO.TTP
/STING/TOOLS/TOOLS.TXT
/STING/TOOLS/TRACROUT.PRG
/STING/TOOLS/TRACROUT.RSC


/STING/DOCS/DIALER.TXT
/STING/DOCS/FEE.DOC
/STING/DOCS/KERN_UPD.TXT
/STING/DOCS/REM_CTRL.TXT
/STING/DOCS/RESOLVE.TXT
/STING/DOCS/SER_STX.TXT
/STING/DOCS/STING.HYP
/STING/DOCS/STING.REF
/STING/DOCS/TCP_STX.TXT
/STING/DOCS/UDP_STX.TXT
/STING/DOCS/UPDATE.TXT

EtherNEC driver

The next step was to grab the EtherNEC drivers (etherne.zip) from here:
http://home.arcor.de/thomas.redelberger/prj/atari/etherne/index.htm
There are a few files here and you need to pick the right one, depending on the processor and OS used.

My system is a stock 68000 processor with TOS 1.04, so the file I needed was enec.stx and this went into the STING folder.

Configuration files

There are a couple of configuration files to edit. I could have used Edith or similar on the ST itself, but for quickness I used my favourite text editor, Smultron (the free, open source version) on my Mac, before transfer.


First up is /STING/DEFAULT.CFG

Under NAMESERVER, I changed the line to point to the IP address of my router, so:

NAMESERVER  = 192.168.1.1

Everything else can stay as it is.


Next, it's /STING/ROUTE.TAB

This was the trickiest to get right. The lines to be changed are right at the bottom - all the rest is just comments.

The separators between the IP addresses need to be tabs (only one each, and definitely a tab, not a space!)

I read some different opinions on what was required - some places specified two lines, but my configuration worked with only one (everything else should be commented out):


0.0.0.0 0.0.0.0 EtherNet 192.168.1.1

The last number there is, again, the IP address of my router.

Strangely, the comments in the file say that the second set of numbers should be the netmask (255.255.255.0 in my case), but this didn't work - 0.0.0.0 did.
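Since a stray space in place of a tab is invisible in most editors, a quick sanity check in the Mac's Terminal before transferring the file over can save a reboot cycle. This is just a sketch - the sample line mirrors my working route configuration, and `cat -t` renders each tab as `^I` so any accidental spaces stand out:

```shell
# Write a sample route line the way ROUTE.TAB expects it: fields
# separated by single tabs (here via printf's \t escapes).
printf '0.0.0.0\t0.0.0.0\tEtherNet\t192.168.1.1\n' > /tmp/route_check

# Display non-printing characters: each tab shows as ^I, while a
# plain space would show as itself.
cat -t /tmp/route_check
```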


XControl configuration

Once those files were all copied over, I rebooted into TOS and opened up the 'Control Panel' desk accessory from the Desk menu.

Here there are two new options - 'STinG Internals' and 'STinG Port Setup'. I didn't touch STinG Internals.

'STinG Port Setup' is where the EtherNEC card is selected and the IP address of the ST set.

There are two dropdown menus at the top. The lower one should always be set to 'EtherNet'. The top one toggles between 'General' and 'Addressing'.

Under 'General' I was able to select the NE2000 hardware (the network card inside EtherNEC), and the MAC address was automatically detected and displayed.

Under 'Addressing', I entered a static IP address and the subnet mask 255.255.255.0. The default MTU of 1500 worked fine for me.


Router configuration

Because of my setup I had to add the MAC address and IP address to my router's list of static IP addresses. Your Mileage May Vary :)



Pinging!

A quick reboot and I was able to run PING.PRG from within the STING/TOOLS folder.

It only accepts numeric IP addresses, so I tested it with 8.8.4.4, which is Google's public DNS server (and easily memorable).

It worked!



(If only it had been this quick - it took many reboots and much editing of the config files before I got to this point. You have this handy guide, though, so you'll be spared the pain :) )

Apps

This post is long enough already, so next time I'll describe a couple of the apps I've been using now that my Atari Mega is successfully connected up to the net.

Let me know if you've found this useful, or if you have any tips for networking on the ST :)


Thursday, 11 February 2016

Mega ST : Revival pt.1

This blog tends to randomly wake up and set off on new tangents, and here comes another one....

A few years ago I heroically saved an Atari Mega ST 4 from almost certain doom, as it was about to be disposed of from a huge uncaring bureaucratic institution where I happened to work. Getting it up and running and exploring what it's capable of in the 21st Century has been an interesting process, so the next few posts will describe what I've been up to, just in case it might be interesting and/or useful.

Growing up I cut my teeth (not literally, ouch) on an Atari STe, so I have a real fondness for the Atari platform. Letting the Mega go to waste would have been a crime! Plus, this machine came with a 50Mb Protar hard drive, which was the stuff of dreams back in the day.

Bringing it back to life was always going to be easier said than done. In the first rush of enthusiasm I cleaned it up, inside and out, and it all appeared to be working fine. I bought an RGB to SCART adapter so I could hook it up to a modern(ish) TV, and invested in an EtherNEC box, which at that time appeared to offer the best hope for easily transferring files to and from modern machines. However, this really required a hard drive - and my inherited hard drive had all of its partitions password protected, and anyone who might have known the passwords was long gone (not dead, just... not around any more).

So, my first attempts faltered. I had no way of accessing the hard drive and no drivers or tools to wipe it and start again. I found a few tools online that might have helped, but transferring them onto floppies was a painstaking process and when they didn't work my enthusiasm slowly ebbed away.

Fast forward to late 2015 and I decided it was about time to dig it out again and make a serious attempt at getting it all up and running.

Key to this process was my discovery, via AtariCrypt and the wonderful Atari-Forum, of Petr Stehlik's PARCP-USB device, which allows an ST to connect to a PC, Mac or Linux machine via USB and quickly transfer files back and forth.

After a long search this thread gave me the Protar drivers I needed for the ProGate 50DC. Using the PARCP-USB adapter I was able to transfer the utilities, update the driver and then format and re-partition the hard drive. I now had 50 whole megabytes of storage to play with! Unimaginable :-)

With that accomplished I had a few goals in mind. The first was to get the EtherNEC box working and find out whether a 30-year old Atari ST could connect to the internet successfully and do anything useful or interesting.

I also had a project in mind to back up a lot of old floppy disks that had been stored away since my STe days, before they succumb to bit-rot and the other perils of ageing magnetic media.

Plus, I also wanted to investigate a strange and intriguing expansion card that I found inside when I first opened up the Mega...

Over the next couple of posts I'll detail some of my experiences setting up the machine, getting it connected to the Internet, and beyond!

Monday, 19 May 2014

#ocTEL Into a routine

I've been settling into ocTEL and a routine of only dipping into what I need to (and can realistically achieve)

I realised quite quickly that as I'm at an early stage of my career in Learning Technology I don't have the same practice experience that others do, or much knowledge of methods and strategies and frameworks, but it's been really interesting to 'eavesdrop' on some of the discussions in the forums and elsewhere.

I think one of the tricky things to deal with (for me anyway) on an open and less structured course like this is the feeling of "being behind". With so much to read and posts continually coming in, I've had to constantly remind myself that it's ok to stop reading and take time to review what I've learned and think reflectively (and critically).

The 'badges' on offer are a great incentive, but at the same time I need to remind myself that not necessarily achieving every single one is ok too :)

Over the last week I enjoyed looking at the Pre-Course Questionnaires, and thinking critically about my own readiness and approach to learning.

I initially chose this questionnaire, as it seemed to be more in-depth than some, and offered more choices than just yes or no:

University of Houston - Distance Education
http://distance.uh.edu/online_learning.html

Although I scored highly on the computer skills areas, I ended up in the 'almost there' category, which I suppose was something of a surprise. Perhaps I was being cautious with my answers, or perhaps (more likely) I genuinely do need to work on preparing for any online learning I might do. Certainly it's been some time since I did any serious study myself, so I'm sure there are some "academic skills" I could do with brushing up on. So, in a way, that outcome has been a positive one, as it's made me take some time to reflect on this and work out how to improve these skills.

I like the idea that a pre-course questionnaire can be for the benefit of both the institution (to gain an insight into the capabilities and potential needs of students-to-be) but also for the potential student (to give them constructive suggestions and pointers about which areas they might need to pay extra attention)

I suppose what struck me most overall was that the questionnaires could easily have been tweaked (and only slightly!) to apply to 'traditional' on-campus courses. I wonder how many Colleges and Universities take into account things like IT literacy, independent study skills and time management skills when preparing to admit undergraduates or postgraduates to face-to-face courses?

Sunday, 4 May 2014

#ocTEL Week 0 thoughts

Over the next couple of months I'm going to attempt to flout all the basic laws of time and space by participating in the Open Course for Technology Enhanced Learning (ocTEL), run by the nice people at the Association for Learning Technology.

ocTEL logo

I participated in the second run of the EDCMOOC last November and although I enjoyed it, I found it hard to keep up with all the various discussions and diversions and groups that formed and morphed and spread out.

Perhaps I relied a bit too much on Twitter. It's very easy to get disheartened when everyone seems so far ahead in their thinking, and I think Twitter heightened this feeling for me. (For some reason, right from the get-go, EVERYONE seems to be much further advanced on Twitter!)

I found it interesting just how much my perception of my own progress and accomplishment relied on interaction and feedback from others. Getting a comment on a blog post seemed to validate it, somehow, whereas a post or a tweet that went seemingly unnoticed was really disheartening. Obviously in a course with such a huge membership there's no way to guarantee interaction, but when the whole premise of the cMOOC relies on connectivity and peer learning - and on each participant being brave and putting stuff out there into the big wide webby world - it's bound to be disheartening when silence is the result.

I did find myself considering giving up on the EDCMOOC, until the last couple of weeks, when I got my teeth into the practical element of creating the final assessment piece - and this spurred me on to complete the course.

This time, I hope that setting some learning goals will help - both to reduce the pressure I put on myself, but also to keep me focused on getting what I want out of the experience.

I also like the idea of forming small groups for reflection and discussion, in order to keep things manageable.

Activity 0.1 Big and Little questions

I'm not sure I have one 'Big Question' I'm looking to find answers or ideas about. I'm still at an early stage in my career in Learning Technology, so I feel I have lots of listening and reading to do. That might seem like a cop-out!

I suppose if I had to pick one area of particular interest, it would be around video and audio, and creating media content in and around TEL. That's quite a wide area, but by that I would include opportunities for collaborative work on such content, perhaps in innovative ways over the web - perhaps I'll blog about this later.

Thursday, 14 November 2013

#edcmooc The Teacher as Technologist?





The first two videos posted for us by the EDCMOOC folks this week come from corporate sources, showing how their products might shape the future of learning.

The first thing that sprang to my mind when watching them was the extent to which the film Minority Report continues to influence visions of future technology. It was released eleven years ago (really?!) and since then we've come some way down the path of touchscreens (tablets, smartphones) and movement-controlled devices (Kinect) but never really got to the point of those amazing, fluid, interactive 3D walls, allowing multiple users to call up and engage with any content (all beautifully rendered), seemingly from any and all possible sources.

It's interesting that tech companies are still inspired by parts of what was certainly a dystopian story. And they still see - and project - this vision for Human Computer Interaction in particular, as a goal that they (and society as a whole) should be striving for.

There's a great article about the lasting effects of Minority Report on the people who've shaped technology over the last decade, over at 'Overthinking It'. There's definitely an interesting debate to be had about whether popular culture predicts technological advancements or whether, conversely, technological advancement is inspired and driven by popular culture.

For me it's not so much a question of whether the visions Intel and Corning are setting out are utopian or dystopian - there are clearly positives and negatives to be taken from them both. My immediate reaction was about how the 'systems' they suggest could work, practically, away from the glossy, manufactured sheen of the advertising film, and what the implications might be if, to whatever extent, the realities fell short of the ideals.

Films - and advertising films especially - are, of course, by nature not balanced and objective. So we must think about the kinds of things these two companies are suggesting and try to work out whether they would actually be practical - or desirable - in the kinds of educational settings we are familiar with today.

One thing which struck me is the extent to which advances in technology could force teachers and academics to become technologists - experts in technology rather than simply experts in their particular field.

What is the fate for those who refuse to (or simply cannot) go along this path? And what knowledge and experience could learners miss out on because of this?

I'm familiar with some of the challenges posed by current online learning technology. Uploading text documents - never mind audio and video or complex multimedia content - to a relatively straight-forward web-based VLE system can be difficult for some. Much of the learning experience for the learner crucially depends on the degree of digital literacy of the teacher or academic.

Looking at the beautiful table-top and interactive wall displays in the Corning video, my thought was "Who is going to create and manage all this content?". Does this vision suggest that the teacher will in future need to be graphic designer, data architect, content curator, computer programmer?

The other question that struck me was how much freedom and creativity in teaching might actually be restrained by the use of such technology, in the sense that the tools available to the teacher might restrict what can and can't be taught, or the way in which that knowledge can be presented.

In this new 'system', which canonical sources are used for the core information, which websites can and can't be trawled for images of bridges, which tablets and software will be compatible? And who decides this kind of thing? The teacher or the company supplying and installing the technology?

The role of corporations in education is something that often provokes controversy. The feeling that large companies might be gaining some measure of control over education at the expense of teachers and local authorities, however innocuous it might seem, would likely be resisted by many.

That's not to say that there aren't positives for me in all this. The possibilities for the use of 3D printing technology in the classroom are exciting. And the idea of better real-time connectivity between the bubble of the classroom and the people and places in the world outside certainly has potential.

But bringing it all together in a workable way - and for the good of the learner - is a big challenge.