Select Committee on Science and Technology Minutes of Evidence


Examination of Witnesses (Questions 380 - 399)

WEDNESDAY 24 JANUARY 2007

MR NICHOLAS BOHM, PROFESSOR IAN WALDEN AND MR PHIL JONES

  Q380  Lord Sutherland of Houndwood: But presumably they are in a position at the moment to draw the attention of the Commissioner's Office to such matters, but they have not done so, so it would not be much of a step forward, would it?

  Professor Walden: Ofcom does not have a very clear remit in the area. It has many other things to do. Were it to be given a specific remit in this area, then I think it would obviously have to devote resource and manpower to tackling that topic.

  Q381  Lord Sutherland of Houndwood: Is that wishful thinking, or do you have evidence that that is how it would move?

  Professor Walden: Ofcom does have responsibility in respect of persistent misuse of networks, for example, under the Communications Act 2003, and, within the scope of that, were changes to be made to expand that remit somewhat, then I would imagine Ofcom would take on that responsibility. It has taken action against those who persistently misuse their electronic networks.

  Q382  Lord Sutherland of Houndwood: Would it need additional powers to be able to enforce?

  Professor Walden: I do not believe it would. I think the existing legislation would be sufficient.

  Mr Jones: I agree, yes.

  Q383  Lord Young of Graffham: This is really for Professor Walden. In the Society's submission to us you are proposing that ISPs should be legally required to put minimum standards of technical protection measures in place, in other words protecting at the centre rather than end-to-end protection on our own machines. What sort of filtering or blocking do you really think would be appropriate?

  Professor Walden: At this point I would say that we are not actually saying the security measures need necessarily be at the level of the ISP. We do not suggest that filtering or blocking at the level of the ISP is the solution; we are just saying that the ISP has a very good, close relationship with the user and therefore it seems an appropriate point of control. What I mean by that is that the ISP can encourage and help the user to implement controls at the user's end. So it could offer filtering and blocking software, and the necessary training and implementation of that software at the end-user point, but it is the ISP who is in a good position to facilitate that implementation.

  Q384  Lord Young of Graffham: But you do point out in your submission that if you have an unprotected ISP, one which does not put anything in the centre, and I as a user do not take any precautions, not only can my machine be infected but I can start to infect other people on the network?

  Professor Walden: Yes. We say in the submission we could leave this completely to the free market and people could choose whether to go for the high grade, perhaps slightly higher cost, fully secured network or go for bare Internet provision or access. In network security it is well known that you are only as secure as the least secure node within the network, and therefore if we do not consider the obligations of end users and any obligations of ISPs we are potentially exposing ourselves.

  Q385  Lord Young of Graffham: I am a great advocate of the free market, but even I do not think that road crossings should be unregulated. I am quite prepared to put up with traffic lights, for example. What slightly concerns me is the ability of one careless user to infect many others. Do you not think there is a role for the centre, the ISP itself, to put in standards?

  Professor Walden: Yes. I think there is. I think there are concerns about their capabilities to do that. One of the recent debates is in respect of child abuse images and the Government's announcement last year that it is going to require all broadband access providers to filter child abuse images on the basis of a list promulgated by the Internet Watch Foundation.

  Q386  Lord Young of Graffham: Is there not a difference between regulating content and actually regulating viruses of any sort or other? Once you get into content you will be getting into very, very dangerous ground?

  Professor Walden: I think you are getting into very dangerous ground, although viruses are just a form of content, and where an ISP may filter for viruses, there is then the question of whether they should filter for unsolicited communications, spam, and whether they should filter for child abuse images. There is a slippery slope whereby, once they are asked to filter and block for one thing, they will be asked to filter and block for others. That is why I must put on record that the SCL does not suggest, and has not in its submissions, that filtering and blocking be implemented at an ISP level. What our submissions suggest is that the ISP must bear some responsibility and is a good partner with end users to improve the security measures which take place both at an end-user level and at an access level.

  Q387  Lord Sutherland of Houndwood: It may not be an apt comparison, but your Society is very happy to tolerate regulations about the sale of guns and alcohol and there is not a slippery slope that automatically follows from that. I would have thought viruses are in a different category from straightforward content.

  Professor Walden: I am Vice-Chairman of the Internet Watch Foundation and I am aware that the organisation does receive reports of suicide websites, of extreme pornography, which the Government is going to legislate on, and of religious hatred, which it legislated on last year. We are seeing a growing concern about the content which is available over the Internet and that is where our concern lies. If you think there is a solution in blocking and filtering at an Internet Service Provider level, that could, by fair means or foul, be extended to a range of content, which I think would fundamentally damage the Internet as an environment for the free exchange of information.

  Q388  Lord Sutherland of Houndwood: Could or would automatically?

  Professor Walden: Could.

  Q389  Lord Young of Graffham: It would allow you to filter politically, for instance?

  Professor Walden: Exactly.

  Lord Sutherland of Houndwood: I understand that, but in the same way the gun law was changed as a result of what happened in Dunblane, now, whether rightly or wrongly, most people accept that some sort of law is appropriate there in a way which is not appropriate for motorbikes and motorcars.

  Q390  Lord Young of Graffham: I am just wondering whether you could not legally distinguish between malware of any sort, viruses, whatever, and content, because I think one is quite clear. A virus sets out to do harm one way or the other, and the other is an expression of opinion, however distasteful you may find it. I would be very concerned if we started to get into questions of free speech. There are some boundaries there, but I am also equally concerned that if the Internet is to flourish we have got to be able to operate within a reasonably protected environment. I think when one careless user can start infecting other people we just have to be very careful, that is all, and I just wondered whether you were aware of any technology or reason why the ISP could not be expected to actually put traffic lights into the centre of the network?

  Professor Walden: Yes. I think the nature of the Internet as a network of networks, and of the technology which underpins it, matters here. Again taking child abuse images as an example, while we have seen a great growth in the availability of child abuse images via the World Wide Web, we are now seeing that this is being replaced by the exchange of such images using peer-to-peer networking applications, which essentially bypass blocking exercised at an ISP level. Therefore, people's ability to use this technology currently outstrips our ability to impose controls, and we could focus all our attention on imposing obligations on the ISP and miss the target.

  Q391  Lord Young of Graffham: That is why I am not concerned with the content. I am looking at phishing and I am looking at the other ways in which people cause harm.

  Professor Walden: Yes. It is all data. It is all zeros and ones which go across the network, whether it is a virus, a child abuse image or a political statement. Our ability to distinguish at a network level the stream of data which is passing is difficult without perhaps capturing too much legitimate data or not capturing the illegitimate data.

  Q392  Chairman: Would you like to say a little more about that? How easy is it to distinguish a pornographic image from any other image?

  Professor Walden: The technology does exist. I do not know its percentage reliability, but you are going to have a whole problem with false positives: one of the classic examples is that talking about Middlesex is going to cause problems with certain filters because the county name ends in the word "sex".
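[The substring problem Professor Walden describes can be sketched in a few lines of Python. This is an editor's illustration, not any real filter: the one-entry blocklist and the word-boundary refinement are invented for the example.]

```python
import re

# Hypothetical one-entry blocklist, purely for illustration.
BLOCKED_WORDS = ["sex"]

def naive_filter(text: str) -> bool:
    """Block on bare substring matching -- the approach that misfires."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

def word_boundary_filter(text: str) -> bool:
    """Match whole words only, which lets 'Middlesex' through."""
    lowered = text.lower()
    return any(re.search(rf"\b{re.escape(word)}\b", lowered)
               for word in BLOCKED_WORDS)

print(naive_filter("Cricket results from Middlesex"))          # True: false positive
print(word_boundary_filter("Cricket results from Middlesex"))  # False
```

[The first filter flags any county, street, or surname containing the banned string; the second shows that even a small refinement changes the false-positive rate, which is exactly the kind of judgment call that would be delegated to ISPs.]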

  Q393  Chairman: I am concerned about images particularly, because this is something which has worried me for a long time. Ultimately, presumably somebody has to look at these images, do they?

  Professor Walden: The technology is certainly sophisticated enough, as far as I am aware, to distinguish image data and flesh tones within a particular message. I do not know the technological developments to extend it so that you can avoid such filtering software, but I am sure the clever people out there are making sure there are ways to get around such software.
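[The flesh-tone detection Professor Walden alludes to can be illustrated with a toy heuristic: flag an image when a large fraction of its pixels fall in a crude skin-colour range. The RGB rule and the 40 per cent cut-off below are invented for this sketch and are not the values of any real product; real classifiers are far more sophisticated, but they misfire for the same structural reason, as Lord Young's red-eye example suggests.]

```python
def is_skin_tone(r: int, g: int, b: int) -> bool:
    """Very rough RGB rule of thumb for skin-like colours (illustrative only)."""
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def flesh_fraction(pixels) -> float:
    """Fraction of (r, g, b) pixels classified as skin-toned."""
    if not pixels:
        return 0.0
    skin = sum(1 for (r, g, b) in pixels if is_skin_tone(r, g, b))
    return skin / len(pixels)

def flag_image(pixels, threshold: float = 0.4) -> bool:
    """Flag the image as suspect when skin-toned pixels exceed the threshold."""
    return flesh_fraction(pixels) >= threshold

# Toy "images": a block dominated by skin-coloured pixels is flagged,
# while a blue sky is not -- and a beach photograph or a portrait in
# warm lighting would be flagged just as readily, hence false positives.
skin_block = [(220, 170, 140)] * 80 + [(30, 60, 200)] * 20
sky_block = [(30, 60, 200)] * 100
print(flag_image(skin_block))  # True
print(flag_image(sky_block))   # False
```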

  Lord Young of Graffham: I have a few years' experience of looking at red eye in photographs and there are enough false positives in that, and that is a very small thing. Pornography is in the eye of the beholder and it is a value judgment more than anything else, and that becomes extremely difficult, I would have thought.

  Q394  Earl of Erroll: What is your view of the idea that software manufacturers should be held liable for how well their software works, and particularly for security flaws in their software, rather than relying on the disclaimers and statements that it is up to the customer to decide whether it is fit for purpose?

  Mr Bohm: I wonder if I might respond on that one? I think there is a good deal of agreement that there is an incentives problem with insecurities in software, namely that the suppliers and the creators by and large do not suffer the adverse consequences to the same extent as their customers and therefore do not have adequate incentives to eliminate the flaws. I think the next step in this argument has not been as well explored as the first proposition. I very much agree with that proposition. I do not think things are entirely satisfactory in terms of incentives; I think that too much defective software has got out. The question of just exactly what you do, having reached that conclusion, is, not surprisingly, quite difficult. You can imagine an extreme approach, the motor vehicle approach, in which there is a set of construction and use regulations with which all software must comply before it is fit to be used on the Internet, and you have got to take your computer in for an MOT test and you have got to take a driving test and get yourself licensed. I am only setting up a regime of that kind as a straw man, because it seems to me grossly disproportionate to the type of loss and injury caused by the defects we are talking about as compared with what motor vehicles do. A second, and possibly more interesting, drawback is that the speed of response of a system of that kind to new developments and changes would be hopelessly slow. If you think of some of the zero day exploits in which a patch is released (or, as Microsoft like to call it, an update) and people disassemble it, work out within a few hours what the problem was that it is trying to fix, and attack the computers of those who have not yet applied it, and you imagine a regulatory system attempting to keep pace in that environment, you will see why I do not think it is feasible, even if it were proportionate, which it probably is not anyway. 
So if you step away from regulation as a way of dealing with the incentive failure, you tend to fall back on saying, "Well, let's make them legally liable to their customers," or in an English law context, "Let's stop them contracting out of those liabilities," and to some extent we do hint at that, because exemption clauses do face reasonableness tests, and so on, and we have tended to take that approach. Will it do enough good? I suggest that it will not do a lot of good, for a number of different and independent reasons. One of them is that the people who suffer the loss are a long way down the chain. Somebody gets hit with a denial of service attack because out on the Internet 10,000 computers have had a weakness exploited, but their owners do not realise they are being used in a concerted attack because somewhere out there there are crooks taking advantage of a weakness. The person injured is many, many steps down a chain—and there is certainly no contractual connection with a software supplier—in which third party actions intervene to cause it, and hoping that you can somehow incentivise the manufacturer to have eliminated those defects by enabling someone to sue is not very likely in that context, because the people who have got the real incentive to sue are simply too far down the chain. The legal difficulties are almost certainly too great. Our attempts to stop people contracting out of liability via unfair contract terms may not work very well, because a lot of software is the subject of international supply contracts, to which those laws tend either not to apply or to apply differently, and which are governed by foreign laws. So the legal mechanisms are not terribly good. The people who do have a position to sue are the customers who can sue PC World, but each of them suffers probably trivial losses and they are simply not going to take up their rights. They are not going to risk getting a costs award against them if they sue for tuppence ha'penny anyway. 
In other words, the risks and losses are diffused by the Internet, and it is not an environment in which beefing up direct liability is an easy thing to do. It is very difficult to get it targeted right. If you get draconian with all the people who supply any software to anybody anywhere, you will impose terrible penalties on all sorts of small tinkerers in the open source community, who will be discouraged from contributing. So you will be reproached for stifling innovation and helping to lock in the monopoly position of large suppliers. I think it is an area fraught with difficulties. The one thing which I think you can do, and it harks back to earlier questions today, is defect notification, rather like breach notification. Defect notification is not particularly onerous. Everybody who supplies software on a commercial scale could be obliged to set up defect reporting procedures, into which all users can easily contract, so that if you acquire a bit of software the web page through which you acquire it has a button to tick which says, "Warn me of defects," and if those who then supply it are obliged to notify the defects promptly you do have something reasonably workable, reasonably consistent with what best practice arguably requires at the moment, and reasonably useful for customers. You are also applying a significant incentive. The more reputation-based people's businesses are, the more significant it will be to compel them to own up to errors and defects and to provide remedies. So I think there are some pressures which could be applied, but there are so many things you could try to make people do that would work badly or adversely. I think it is a field to move cautiously in, even if you accept 100 per cent, as I do, that the incentives are not entirely happy in their operation as they are.

  Q395  Earl of Erroll: I imagine people could use the first part of your reply against the Sale of Goods Act and the requirement that goods be suitable for the purpose for which they were intended, but I will not get drawn into that discussion! I know that the SCL suggested that perhaps software could be labelled with traffic lights to indicate how fit it was for purpose. The trouble with that, presumably, is who does the testing, who does the certification, how do you make sure that it is all kept up to date, and how do you deal with bugs after it is released?

  Professor Walden: Yes. I think the model we were thinking of is that all telecommunications equipment has to be type approved. That type approval process is essentially a self-certification process whereby manufacturers such as Nokia, for example, do not have to submit their telephones to a third party for certification as meeting minimum standards, and those minimum standards include that the equipment will not kill the user and will not kill the network. I think a self-certification scheme would be perfectly feasible. Clearly, the question of who sets the standards is a complex issue, but again industry would seem best placed to start down that road.

  Q396  Earl of Erroll: Yes. I tend to look at reviews, I must say, in magazines. What about what is almost a sell-by date? We have been looking at ideas of whether software might be labelled to make sure it is sufficiently up to date when it is being sold. Do you see any legal problems with requiring shops, when they are selling something to customers, to report whether the software on it is sufficiently up to date?

  Professor Walden: I do not know if there are statistics in terms of how much software is now sold in shops as opposed to sold on-line.

  Q397  Earl of Erroll: Or on-line then?

  Professor Walden: Again, the idea goes back to an earlier question. In the UK certainly we do not generally criminalise negligence, and if we do not criminalise it, if it just gives rise to some civil liability, the ability of individuals to take legal action under the English legal system is so expensive that I am not sure it necessarily offers much benefit unless we can look to a regulator who acts as our surrogate. The Information Commissioner is not in that position. Ofcom could be in that position in certain circumstances in respect of those who provide Internet access. In the software industry there is no obvious party which would necessarily be well placed to take action on our behalf.

  Q398  Earl of Erroll: Trading Standards could make sure that shops are selling stuff which is up to date and that you are not buying out-of-date stuff which is going to leave your machine immediately open to viruses when you buy a new laptop or something?

  Professor Walden: Sure. Were it to be for industry to set standards about when software is in date or out of date, then that would be a potential mechanism.

  Q399  Lord Young of Graffham: Could we not set standards so that when a piece of software is first used and connected to the Internet it downloads any upgrades? In other words, make that automatic, because that would cover the point. So, however old the software was, once it was being used it would be upgraded?

  Professor Walden: Yes. I think the complexity of software, and of how people use that software, would potentially cause some problems with that system, but yes. With Microsoft Windows, for example, Microsoft has improved the way in which updates are distributed and installed on people's machines.
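[Lord Young's proposal, a mandatory update check the first time software runs, reduces to a version comparison. The sketch below is an editor's illustration with hypothetical version strings; a real implementation would fetch the latest version number from the vendor over the network before comparing.]

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '1.4.1' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(installed: str, latest: str) -> bool:
    """True when the installed version is older than the latest published one."""
    return parse_version(installed) < parse_version(latest)

def first_run_check(installed: str, latest: str) -> str:
    """On first launch, decide whether to trigger an automatic upgrade."""
    if needs_upgrade(installed, latest):
        return f"upgrading {installed} -> {latest}"
    return "up to date"

print(first_run_check("1.2.0", "1.4.1"))  # upgrading 1.2.0 -> 1.4.1
print(first_run_check("1.4.1", "1.4.1"))  # up to date
```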

  Lord Young of Graffham: It drives you mad because it happens every Tuesday!

  Chairman: Most software companies are doing that now, but it might be made a requirement. Let us move on because we want to have a chance to talk to you about spam.


 

© Parliamentary copyright 2007